Mar 17 01:09:33 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 17 01:09:33 crc restorecon[4592]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 01:09:33 crc restorecon[4592]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc 
restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc 
restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 
01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 
crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 
01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc 
restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc 
restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc 
restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 01:09:33 crc restorecon[4592]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:33 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 
crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc 
restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc 
restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc 
restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc 
restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc 
restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 01:09:34 crc restorecon[4592]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 17 01:09:34 crc kubenswrapper[4735]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 01:09:34 crc kubenswrapper[4735]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 17 01:09:34 crc kubenswrapper[4735]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 01:09:34 crc kubenswrapper[4735]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 17 01:09:34 crc kubenswrapper[4735]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 01:09:34 crc kubenswrapper[4735]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.824399 4735 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.832636 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833064 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833075 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833086 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833096 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833107 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833115 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833123 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833133 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833141 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833150 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833157 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833165 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833173 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833180 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833188 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833196 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833204 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833211 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833223 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833234 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833243 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833252 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833261 4735 feature_gate.go:330] unrecognized feature gate: Example Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833269 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833277 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833285 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833292 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833300 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833307 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833315 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833323 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833330 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833340 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833349 4735 feature_gate.go:330] 
unrecognized feature gate: GatewayAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833358 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833367 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833375 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833384 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833427 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833435 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833443 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833465 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833473 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833481 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833489 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833497 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833504 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833512 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 17 01:09:34 crc kubenswrapper[4735]: 
W0317 01:09:34.833519 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833527 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833537 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833546 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833555 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833563 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833570 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833578 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833586 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833594 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833601 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833609 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833619 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833628 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 17 01:09:34 crc 
kubenswrapper[4735]: W0317 01:09:34.833635 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833642 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833650 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833657 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833665 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833673 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833680 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.833687 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833886 4735 flags.go:64] FLAG: --address="0.0.0.0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833904 4735 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833919 4735 flags.go:64] FLAG: --anonymous-auth="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833930 4735 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833942 4735 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833951 4735 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833962 4735 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833976 4735 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833985 4735 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.833994 4735 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834003 4735 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834013 4735 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834022 4735 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834031 4735 flags.go:64] FLAG: --cgroup-root="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834040 4735 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834049 4735 flags.go:64] FLAG: --client-ca-file="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834058 4735 flags.go:64] FLAG: --cloud-config="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834067 4735 flags.go:64] FLAG: --cloud-provider="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834076 4735 flags.go:64] FLAG: --cluster-dns="[]" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834087 4735 flags.go:64] FLAG: --cluster-domain="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834096 4735 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834104 4735 flags.go:64] FLAG: --config-dir="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834113 4735 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834123 4735 flags.go:64] FLAG: --container-log-max-files="5" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 
01:09:34.834134 4735 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834143 4735 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834152 4735 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834161 4735 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834170 4735 flags.go:64] FLAG: --contention-profiling="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834179 4735 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834188 4735 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834199 4735 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834207 4735 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834218 4735 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834227 4735 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834236 4735 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834245 4735 flags.go:64] FLAG: --enable-load-reader="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834254 4735 flags.go:64] FLAG: --enable-server="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834263 4735 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834274 4735 flags.go:64] FLAG: --event-burst="100" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834283 4735 flags.go:64] FLAG: --event-qps="50" Mar 17 01:09:34 
crc kubenswrapper[4735]: I0317 01:09:34.834292 4735 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834300 4735 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834310 4735 flags.go:64] FLAG: --eviction-hard="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834321 4735 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834330 4735 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834341 4735 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834351 4735 flags.go:64] FLAG: --eviction-soft="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834360 4735 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834369 4735 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834379 4735 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834387 4735 flags.go:64] FLAG: --experimental-mounter-path="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834396 4735 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834405 4735 flags.go:64] FLAG: --fail-swap-on="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834414 4735 flags.go:64] FLAG: --feature-gates="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834424 4735 flags.go:64] FLAG: --file-check-frequency="20s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834433 4735 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834443 4735 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834451 4735 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834461 4735 flags.go:64] FLAG: --healthz-port="10248" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834469 4735 flags.go:64] FLAG: --help="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834479 4735 flags.go:64] FLAG: --hostname-override="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834487 4735 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834496 4735 flags.go:64] FLAG: --http-check-frequency="20s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834504 4735 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834515 4735 flags.go:64] FLAG: --image-credential-provider-config="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834523 4735 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834532 4735 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834541 4735 flags.go:64] FLAG: --image-service-endpoint="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834549 4735 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834558 4735 flags.go:64] FLAG: --kube-api-burst="100" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834567 4735 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834576 4735 flags.go:64] FLAG: --kube-api-qps="50" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834585 4735 flags.go:64] FLAG: --kube-reserved="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834594 4735 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834603 4735 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834612 4735 flags.go:64] FLAG: --kubelet-cgroups="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834620 4735 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834629 4735 flags.go:64] FLAG: --lock-file="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834639 4735 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834649 4735 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834659 4735 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834672 4735 flags.go:64] FLAG: --log-json-split-stream="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834681 4735 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834689 4735 flags.go:64] FLAG: --log-text-split-stream="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834698 4735 flags.go:64] FLAG: --logging-format="text" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834707 4735 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834716 4735 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834725 4735 flags.go:64] FLAG: --manifest-url="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834733 4735 flags.go:64] FLAG: --manifest-url-header="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834745 4735 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834754 4735 
flags.go:64] FLAG: --max-open-files="1000000" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834764 4735 flags.go:64] FLAG: --max-pods="110" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834773 4735 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834782 4735 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834790 4735 flags.go:64] FLAG: --memory-manager-policy="None" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834800 4735 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834809 4735 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834817 4735 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834826 4735 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834851 4735 flags.go:64] FLAG: --node-status-max-images="50" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834887 4735 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834896 4735 flags.go:64] FLAG: --oom-score-adj="-999" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834905 4735 flags.go:64] FLAG: --pod-cidr="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834914 4735 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834932 4735 flags.go:64] FLAG: --pod-manifest-path="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834944 4735 flags.go:64] FLAG: --pod-max-pids="-1" Mar 17 
01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834955 4735 flags.go:64] FLAG: --pods-per-core="0" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834966 4735 flags.go:64] FLAG: --port="10250" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834978 4735 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.834990 4735 flags.go:64] FLAG: --provider-id="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835003 4735 flags.go:64] FLAG: --qos-reserved="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835014 4735 flags.go:64] FLAG: --read-only-port="10255" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835025 4735 flags.go:64] FLAG: --register-node="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835035 4735 flags.go:64] FLAG: --register-schedulable="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835045 4735 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835062 4735 flags.go:64] FLAG: --registry-burst="10" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835071 4735 flags.go:64] FLAG: --registry-qps="5" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835081 4735 flags.go:64] FLAG: --reserved-cpus="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835090 4735 flags.go:64] FLAG: --reserved-memory="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835100 4735 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835115 4735 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835124 4735 flags.go:64] FLAG: --rotate-certificates="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835133 4735 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835141 
4735 flags.go:64] FLAG: --runonce="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835150 4735 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835159 4735 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835168 4735 flags.go:64] FLAG: --seccomp-default="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835177 4735 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835185 4735 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835195 4735 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835204 4735 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835246 4735 flags.go:64] FLAG: --storage-driver-password="root" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835256 4735 flags.go:64] FLAG: --storage-driver-secure="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835266 4735 flags.go:64] FLAG: --storage-driver-table="stats" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835274 4735 flags.go:64] FLAG: --storage-driver-user="root" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835283 4735 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835292 4735 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835301 4735 flags.go:64] FLAG: --system-cgroups="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835310 4735 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835324 4735 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 17 
01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835333 4735 flags.go:64] FLAG: --tls-cert-file="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835343 4735 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835355 4735 flags.go:64] FLAG: --tls-min-version="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835364 4735 flags.go:64] FLAG: --tls-private-key-file="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835373 4735 flags.go:64] FLAG: --topology-manager-policy="none" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835382 4735 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835393 4735 flags.go:64] FLAG: --topology-manager-scope="container" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835402 4735 flags.go:64] FLAG: --v="2" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835414 4735 flags.go:64] FLAG: --version="false" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835425 4735 flags.go:64] FLAG: --vmodule="" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835435 4735 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.835445 4735 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835669 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835682 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835692 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835701 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835709 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835718 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835726 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835735 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835743 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835751 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835759 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835767 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835775 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835782 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835790 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 17 01:09:34 crc 
kubenswrapper[4735]: W0317 01:09:34.835798 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835806 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835813 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835821 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835829 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835836 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835844 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835851 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835887 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835895 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835903 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835912 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835921 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835929 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835936 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 17 
01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835944 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835952 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835960 4735 feature_gate.go:330] unrecognized feature gate: Example Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835967 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835976 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835983 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.835994 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836002 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836010 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836018 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836025 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836033 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836042 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836049 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836060 4735 
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836069 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836078 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836087 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836095 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836103 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836111 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836118 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836126 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836134 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836141 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836149 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836156 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836173 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 17 01:09:34 crc 
kubenswrapper[4735]: W0317 01:09:34.836181 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836189 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836197 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836204 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836212 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836221 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836228 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836235 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836243 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836252 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836260 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836268 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.836277 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.836290 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.850427 4735 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.850469 4735 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850620 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850648 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850657 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850666 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850677 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850688 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850697 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850707 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850715 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850724 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850733 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850741 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850749 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850757 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850765 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850773 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850781 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850790 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850797 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: 
W0317 01:09:34.850805 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850815 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850826 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850835 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850843 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850851 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850887 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850896 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850904 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850913 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850923 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850934 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850949 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850964 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850974 4735 feature_gate.go:330] unrecognized feature gate: Example Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850987 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.850998 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851008 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851019 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851030 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851039 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851047 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851055 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851063 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851071 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851079 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851086 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851094 4735 feature_gate.go:330] unrecognized 
feature gate: NetworkSegmentation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851101 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851109 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851117 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851124 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851133 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851141 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851149 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851157 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851167 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851178 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851187 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851196 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851205 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851212 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851220 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851228 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851235 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851243 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851252 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851259 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851267 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851275 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851283 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851292 4735 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.851305 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851592 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851612 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851621 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851631 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851640 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851649 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851657 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851667 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851676 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851685 4735 feature_gate.go:330] unrecognized 
feature gate: HardwareSpeed Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851693 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851702 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851712 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851720 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851728 4735 feature_gate.go:330] unrecognized feature gate: Example Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851736 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851744 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851752 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851760 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851771 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851781 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851790 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851799 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851809 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851820 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851828 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851837 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851846 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851901 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851910 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851918 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851926 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851933 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851941 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851951 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851959 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851967 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851975 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851983 4735 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851991 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.851999 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852007 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852014 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852022 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852031 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852039 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852047 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852055 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852062 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852070 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852078 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852086 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852094 4735 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852101 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852108 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852117 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852125 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852132 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852142 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852151 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852161 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852169 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852178 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852186 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852193 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852203 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852212 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852220 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852228 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852236 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.852245 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.852257 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.853300 4735 server.go:940] "Client rotation is on, will bootstrap in background" Mar 17 01:09:34 crc kubenswrapper[4735]: E0317 01:09:34.858823 4735 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.864743 4735 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.865209 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.868418 4735 server.go:997] "Starting client certificate rotation" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.868484 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.868677 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.895445 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.898368 4735 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 01:09:34 crc kubenswrapper[4735]: E0317 01:09:34.902189 4735 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.913168 4735 log.go:25] "Validated CRI v1 runtime API" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.946062 4735 log.go:25] "Validated CRI v1 image API" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.948547 4735 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.955460 4735 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-17-01-04-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.955544 4735 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.970913 4735 manager.go:217] Machine: {Timestamp:2026-03-17 01:09:34.96774637 +0000 UTC m=+0.599979368 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4bab9c2b-e779-4cf2-8464-6eca29daaf6c BootID:44ff1d0f-f4fc-4cb2-a632-dc715bf02555 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:20:41:cb Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:20:41:cb Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ac:1d:d1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ac:26:33 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:36:cd:e4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:41:ec:4e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:4e:1c:ea:02:11 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:b4:63:55:45:5b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.971271 4735 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.971484 4735 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.974754 4735 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.975154 4735 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.975232 4735 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.976290 4735 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.976321 4735 container_manager_linux.go:303] "Creating device plugin manager" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.976974 4735 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.977008 4735 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.977398 4735 state_mem.go:36] "Initialized new in-memory state store" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.977512 4735 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.982247 4735 kubelet.go:418] "Attempting to sync node with API server" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.982285 4735 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.982331 4735 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.982349 4735 kubelet.go:324] "Adding apiserver pod source" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.982366 4735 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.986784 4735 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.987996 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:34 crc kubenswrapper[4735]: W0317 01:09:34.988006 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.65:6443: connect: connection refused Mar 17 01:09:34 crc kubenswrapper[4735]: E0317 01:09:34.988161 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:34 crc kubenswrapper[4735]: E0317 01:09:34.988083 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.988482 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.991359 4735 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993212 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993269 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993284 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993297 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993328 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993344 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993358 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993386 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993410 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993430 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993452 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.993468 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.994915 4735 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.995841 4735 server.go:1280] "Started kubelet" Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.997595 4735 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.997925 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.997600 4735 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 01:09:34 crc kubenswrapper[4735]: I0317 01:09:34.998742 4735 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 01:09:34 crc systemd[1]: Started Kubernetes Kubelet. Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.001878 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.002053 4735 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.002187 4735 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.002202 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.002217 4735 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.002417 4735 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.004359 4735 server.go:460] "Adding debug handlers to kubelet server" Mar 17 01:09:35 crc 
kubenswrapper[4735]: E0317 01:09:35.005165 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms" Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.005476 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.005568 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.006542 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d7ba49142f2c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,LastTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.009026 4735 factory.go:55] Registering systemd factory 
Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.009070 4735 factory.go:221] Registration of the systemd container factory successfully Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.010510 4735 factory.go:153] Registering CRI-O factory Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.010531 4735 factory.go:221] Registration of the crio container factory successfully Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.010595 4735 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.010615 4735 factory.go:103] Registering Raw factory Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.010628 4735 manager.go:1196] Started watching for new ooms in manager Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.011218 4735 manager.go:319] Starting recovery of all containers Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018590 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018685 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018720 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018746 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018776 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018796 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018814 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018840 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018893 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018926 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018952 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.018980 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019000 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019034 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019054 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" 
seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019081 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019101 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019121 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019146 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019205 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019227 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 
01:09:35.019255 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019279 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019318 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019339 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019359 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019394 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019424 4735 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019442 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019471 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019490 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019518 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019750 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.019773 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020172 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020192 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020224 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020242 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020393 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020422 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020666 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.020735 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.024556 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.024619 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.024652 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.024735 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.024773 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028356 4735 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028473 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028509 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028535 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028555 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028579 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028618 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028692 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028720 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028743 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028764 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028784 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028805 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028849 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028899 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028919 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028943 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028965 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.028985 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029004 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029028 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029051 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029076 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029097 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029121 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029144 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029198 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029224 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029243 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029262 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029289 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029309 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029355 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029374 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029395 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029414 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029433 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029456 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029476 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029495 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029515 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029536 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029555 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029575 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029597 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029617 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029637 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029663 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029682 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029702 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029721 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029741 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029763 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029785 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029803 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029824 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029908 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029955 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.029997 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 17 01:09:35 crc 
kubenswrapper[4735]: I0317 01:09:35.030022 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030045 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030069 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030101 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030130 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030159 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030181 4735 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030205 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030227 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030249 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030301 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030326 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030349 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030373 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030397 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030421 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030442 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030494 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030519 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030543 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030569 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030592 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030612 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030634 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030654 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030677 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030703 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030725 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030747 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030767 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030790 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" 
seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030813 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030838 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030888 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030910 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030932 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.030978 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 
01:09:35.030999 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031021 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031043 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031067 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031088 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031186 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031213 4735 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031238 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031260 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031284 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031305 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031326 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031347 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031370 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031392 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031419 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031442 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031462 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031486 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031507 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031528 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031551 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031574 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031597 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031621 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031645 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031668 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031689 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031709 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031730 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031750 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031771 4735 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031790 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031813 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031833 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031881 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031903 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031922 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031942 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031965 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.031985 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032006 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032025 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032047 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032069 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032090 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032110 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032133 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032152 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032172 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032195 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032218 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032242 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032264 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032289 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032309 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" 
seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032385 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032411 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032434 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032453 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032472 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032491 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032513 4735 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032536 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032557 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032576 4735 reconstruct.go:97] "Volume reconstruction finished" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.032591 4735 reconciler.go:26] "Reconciler: start to sync state" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.048474 4735 manager.go:324] Recovery completed Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.057308 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.059162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.059281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.059365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.060426 4735 
cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.060501 4735 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.060606 4735 state_mem.go:36] "Initialized new in-memory state store" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.069625 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.071681 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.071748 4735 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.071792 4735 kubelet.go:2335] "Starting kubelet main sync loop" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.071950 4735 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.072980 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.073035 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.075447 4735 policy_none.go:49] "None policy: Start" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.076688 4735 
memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.076787 4735 state_mem.go:35] "Initializing new in-memory state store" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.102946 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.121837 4735 manager.go:334] "Starting Device Plugin manager" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.121942 4735 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.121958 4735 server.go:79] "Starting device plugin registration server" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.122551 4735 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.122568 4735 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.123955 4735 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.124388 4735 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.124401 4735 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.133323 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.172180 4735 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.172367 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.173894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.173944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.173956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.174112 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.174298 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.174390 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.174924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.174962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.174974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.175141 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.175575 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.175614 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.176492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.176522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.176497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.176569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.176577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.176532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.177120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.177152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.177165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.177284 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.177460 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.177533 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.178726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.178773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.178791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.179166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.179200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.179218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.179362 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.180178 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.180243 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.181596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.181626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.181637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.183074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.183103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.183116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.183329 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.183365 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.187766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.187811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.187822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.206318 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms" Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.210603 4735 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/cpuset.cpus.effective": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/cpuset.cpus.effective: no such device Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.223315 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.226469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.226501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.226511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.226539 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.226993 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234584 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234642 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234770 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 
01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.234976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.235018 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.235038 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.235070 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.235090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.235207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336814 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336881 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336908 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336974 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.336996 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337042 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337083 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337088 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337385 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337464 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337502 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337548 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337594 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.337780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.427839 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.430099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.430175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.430204 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.430256 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.431039 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.506759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.523446 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.539788 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.549178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.552328 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.573242 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3f8a67a063b88bdb9a862d07404e46fb1ff99130dfa348aaca1128664eee292d WatchSource:0}: Error finding container 3f8a67a063b88bdb9a862d07404e46fb1ff99130dfa348aaca1128664eee292d: Status 404 returned error can't find the container with id 3f8a67a063b88bdb9a862d07404e46fb1ff99130dfa348aaca1128664eee292d Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.592515 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d917c00f6255a3d7a4c6f31a4ba0579a5f5efa8c1fdaff4b749a3a136705c358 WatchSource:0}: Error finding container d917c00f6255a3d7a4c6f31a4ba0579a5f5efa8c1fdaff4b749a3a136705c358: Status 404 returned error can't find the container with id d917c00f6255a3d7a4c6f31a4ba0579a5f5efa8c1fdaff4b749a3a136705c358 Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.594304 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0144ab37640a1f153ca5ca41fd2f280917a14c6a77132c2b5639edfee8e3b6e5 WatchSource:0}: Error finding container 0144ab37640a1f153ca5ca41fd2f280917a14c6a77132c2b5639edfee8e3b6e5: Status 404 returned error can't find the container with id 0144ab37640a1f153ca5ca41fd2f280917a14c6a77132c2b5639edfee8e3b6e5 Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.607562 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection 
refused" interval="800ms" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.831913 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.833925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.833985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.833996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.834033 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.834629 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 17 01:09:35 crc kubenswrapper[4735]: W0317 01:09:35.967498 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:35 crc kubenswrapper[4735]: E0317 01:09:35.967641 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:35 crc kubenswrapper[4735]: I0317 01:09:35.999179 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:36 crc kubenswrapper[4735]: W0317 01:09:36.020256 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:36 crc kubenswrapper[4735]: E0317 01:09:36.020425 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.076995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a9210c2b00301f1c0192a47738266a2a211032b0e6385f9a1ffb4c7f7c432b09"} Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.079544 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e8e8cbab959f4bc50c6437df8a6d19e719859b31a5bfc73cd74e057caefc2291"} Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.084464 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d917c00f6255a3d7a4c6f31a4ba0579a5f5efa8c1fdaff4b749a3a136705c358"} Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.087827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0144ab37640a1f153ca5ca41fd2f280917a14c6a77132c2b5639edfee8e3b6e5"} Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.090475 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3f8a67a063b88bdb9a862d07404e46fb1ff99130dfa348aaca1128664eee292d"} Mar 17 01:09:36 crc kubenswrapper[4735]: W0317 01:09:36.211087 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:36 crc kubenswrapper[4735]: E0317 01:09:36.211174 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:36 crc kubenswrapper[4735]: W0317 01:09:36.339632 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:36 crc kubenswrapper[4735]: E0317 01:09:36.339752 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:36 crc kubenswrapper[4735]: 
E0317 01:09:36.409051 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.636678 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.643886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.643944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.643968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.644020 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:36 crc kubenswrapper[4735]: E0317 01:09:36.644992 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 17 01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.947179 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 01:09:36 crc kubenswrapper[4735]: E0317 01:09:36.948589 4735 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 
01:09:36 crc kubenswrapper[4735]: I0317 01:09:36.998752 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.098503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.098570 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.098585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.100561 4735 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87" exitCode=0 Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.101001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.101269 4735 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.103275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.103308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.103322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.104303 4735 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="12dc95aa9531891349369f0ec1f96283745934cf8364db5ac2d2aa6e21bea258" exitCode=0 Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.104355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"12dc95aa9531891349369f0ec1f96283745934cf8364db5ac2d2aa6e21bea258"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.104334 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.106594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.107012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.107310 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.110474 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="4a6941015464448814924fb4ef34bb3f345d0af65321d87abbe09fce44e6cb0e" exitCode=0 Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.110614 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a6941015464448814924fb4ef34bb3f345d0af65321d87abbe09fce44e6cb0e"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.110798 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.113241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.113363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.113466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.114201 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2" exitCode=0 Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.114275 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2"} Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.114439 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.115749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:37 crc 
kubenswrapper[4735]: I0317 01:09:37.115928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.115969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.119199 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.121040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.121110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.121131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:37 crc kubenswrapper[4735]: I0317 01:09:37.998777 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:38 crc kubenswrapper[4735]: E0317 01:09:38.010820 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.125949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 
01:09:38.126002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.126013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.128846 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.128986 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.135941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.135971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.135981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.137372 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.137437 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.137452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.137576 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.138525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.138585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.138605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.143817 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5a8332dbfb1555aad08f74de3fd8d3ce4bc6fe8a386576a0a3383c06ba44c64c"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.143997 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.145048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.145083 
4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.145103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.147812 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bfb71e4914f698504cecd637728dd2b2d92a11b2690d62e8ff2cba234a99d425" exitCode=0 Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.147871 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bfb71e4914f698504cecd637728dd2b2d92a11b2690d62e8ff2cba234a99d425"} Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.148023 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.149238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.149266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.149277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.245330 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.248403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.248741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.248784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:38 crc kubenswrapper[4735]: I0317 01:09:38.248832 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:38 crc kubenswrapper[4735]: E0317 01:09:38.251404 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Mar 17 01:09:38 crc kubenswrapper[4735]: W0317 01:09:38.347547 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:38 crc kubenswrapper[4735]: E0317 01:09:38.347678 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Mar 17 01:09:38 crc kubenswrapper[4735]: W0317 01:09:38.619545 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Mar 17 01:09:38 crc kubenswrapper[4735]: E0317 01:09:38.619822 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: 
connection refused" logger="UnhandledError" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.021942 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.158658 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"12c850965b24023f4ae6d0b124a04acd56a2c25dc913a0ea2f88db113d6c5657"} Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.158740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b"} Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.158847 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.160466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.160537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.160564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.163127 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="715eabeab7af3adc41640639b42add06b605845c1aa0cce44b29d13566869989" exitCode=0 Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.163256 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:39 crc 
kubenswrapper[4735]: I0317 01:09:39.163290 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.163301 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.163355 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.164040 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.164038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"715eabeab7af3adc41640639b42add06b605845c1aa0cce44b29d13566869989"} Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.164657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.164713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.164738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.165306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.165358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.165376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 
01:09:39.165627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.165682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.165707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.167578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.167802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.167994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:39 crc kubenswrapper[4735]: I0317 01:09:39.511956 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.172184 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.172977 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e465925486e197e4a0c1bf0a0b6aba1e9ef2951adeb411751318717714ec4714"} Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.173034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0fa69219150f999d851b6f41d371147965a3cf96b11bfcb817b0aadc9f6fe8c5"} Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.173059 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0c04cbfbb67d0a5b4c5a1f1229ce1fe24935220b3b7e1e75b61e1c1d1b366c3a"} Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.173167 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.173204 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.174342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.174397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.174416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.175557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.175600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.175617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:40 crc kubenswrapper[4735]: I0317 01:09:40.355264 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.184281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1a7c58aa0d9399b17e2baf58cc1871926076db25003718c2a29281da916404b4"} Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.184363 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1ad74fef26e7a5d9ffc0c7112c7cff8edfc99f33d97282d92c185bb509f4847c"} Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.184467 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.184598 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.186250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.186282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.186295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.186414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.186468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.186550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.267273 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.267580 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.269017 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.269157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.269213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.269231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.427690 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.451890 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.453600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.453661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.453679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:41 crc kubenswrapper[4735]: I0317 01:09:41.453730 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.187407 4735 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.188374 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.188692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.188745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.188766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.190124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.190183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:42 crc kubenswrapper[4735]: I0317 01:09:42.190205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.190780 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.192236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.192345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.192365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 
01:09:43.390746 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.391589 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.394008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.394079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.394100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.690436 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.690719 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.693091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.693158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:43 crc kubenswrapper[4735]: I0317 01:09:43.693177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:45 crc kubenswrapper[4735]: E0317 01:09:45.133838 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:09:46 crc kubenswrapper[4735]: I0317 01:09:46.596229 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:46 crc kubenswrapper[4735]: I0317 01:09:46.596546 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:46 crc kubenswrapper[4735]: I0317 01:09:46.598597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:46 crc kubenswrapper[4735]: I0317 01:09:46.598680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:46 crc kubenswrapper[4735]: I0317 01:09:46.598699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:46 crc kubenswrapper[4735]: I0317 01:09:46.602425 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:47 crc kubenswrapper[4735]: I0317 01:09:47.158837 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:47 crc kubenswrapper[4735]: I0317 01:09:47.205234 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:47 crc kubenswrapper[4735]: I0317 01:09:47.207484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:47 crc kubenswrapper[4735]: I0317 01:09:47.207571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:47 crc kubenswrapper[4735]: I0317 01:09:47.207590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:47 crc kubenswrapper[4735]: I0317 01:09:47.214058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.092824 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.093191 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.095787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.095892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.095913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.208200 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.209835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.209966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.209996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:48 crc kubenswrapper[4735]: W0317 01:09:48.927839 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.928016 4735 trace.go:236] 
Trace[614702605]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Mar-2026 01:09:38.926) (total time: 10001ms): Mar 17 01:09:48 crc kubenswrapper[4735]: Trace[614702605]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:09:48.927) Mar 17 01:09:48 crc kubenswrapper[4735]: Trace[614702605]: [10.001508512s] [10.001508512s] END Mar 17 01:09:48 crc kubenswrapper[4735]: E0317 01:09:48.928049 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 17 01:09:48 crc kubenswrapper[4735]: I0317 01:09:48.999781 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 17 01:09:49 crc kubenswrapper[4735]: W0317 01:09:49.073183 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.073281 4735 trace.go:236] Trace[642317959]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Mar-2026 01:09:39.072) (total time: 10001ms): Mar 17 01:09:49 crc kubenswrapper[4735]: Trace[642317959]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:09:49.073) Mar 17 01:09:49 crc kubenswrapper[4735]: 
Trace[642317959]: [10.001146212s] [10.001146212s] END Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.073310 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.210557 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.211618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.211662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.211673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.613473 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d7ba49142f2c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,LastTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:09:49 crc kubenswrapper[4735]: W0317 01:09:49.615127 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.615248 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.616163 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.616261 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 17 01:09:49 crc kubenswrapper[4735]: W0317 01:09:49.616929 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.617016 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.621048 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.621129 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.623051 4735 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.623619 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 01:09:49 crc kubenswrapper[4735]: E0317 01:09:49.624999 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:49Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.630293 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55968->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 17 01:09:49 crc kubenswrapper[4735]: I0317 01:09:49.630380 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55968->192.168.126.11:17697: read: connection reset by peer" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.004480 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-17T01:09:50Z is after 2026-02-23T05:33:13Z Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.159329 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.159425 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.216684 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.221093 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="12c850965b24023f4ae6d0b124a04acd56a2c25dc913a0ea2f88db113d6c5657" exitCode=255 Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.221156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"12c850965b24023f4ae6d0b124a04acd56a2c25dc913a0ea2f88db113d6c5657"} Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.221368 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.222498 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.222538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.222550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:50 crc kubenswrapper[4735]: I0317 01:09:50.223078 4735 scope.go:117] "RemoveContainer" containerID="12c850965b24023f4ae6d0b124a04acd56a2c25dc913a0ea2f88db113d6c5657" Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.001656 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:51Z is after 2026-02-23T05:33:13Z Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.227123 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.229455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5"} Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.229699 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.231198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.231254 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:51 crc kubenswrapper[4735]: I0317 01:09:51.231269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.005024 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:52Z is after 2026-02-23T05:33:13Z Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.237971 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.239778 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.243515 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" exitCode=255 Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.243607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5"} Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.243725 4735 scope.go:117] "RemoveContainer" containerID="12c850965b24023f4ae6d0b124a04acd56a2c25dc913a0ea2f88db113d6c5657" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.243925 4735 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.245573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.245621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.245642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:52 crc kubenswrapper[4735]: I0317 01:09:52.246637 4735 scope.go:117] "RemoveContainer" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 01:09:52 crc kubenswrapper[4735]: E0317 01:09:52.247165 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.003596 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:53Z is after 2026-02-23T05:33:13Z Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.248362 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 17 01:09:53 crc kubenswrapper[4735]: W0317 01:09:53.450493 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:53Z is after 2026-02-23T05:33:13Z Mar 17 01:09:53 crc kubenswrapper[4735]: E0317 01:09:53.450598 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.698828 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.699015 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.700642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.700704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.700762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.701656 4735 scope.go:117] "RemoveContainer" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 01:09:53 crc kubenswrapper[4735]: E0317 01:09:53.702025 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:09:53 crc kubenswrapper[4735]: I0317 01:09:53.708043 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:53 crc kubenswrapper[4735]: W0317 01:09:53.721931 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:53Z is after 2026-02-23T05:33:13Z Mar 17 01:09:53 crc kubenswrapper[4735]: E0317 01:09:53.722031 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:54 crc kubenswrapper[4735]: I0317 01:09:54.004327 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:54Z is after 2026-02-23T05:33:13Z Mar 17 01:09:54 crc kubenswrapper[4735]: I0317 01:09:54.253033 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
17 01:09:54 crc kubenswrapper[4735]: I0317 01:09:54.254383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:54 crc kubenswrapper[4735]: I0317 01:09:54.254437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:54 crc kubenswrapper[4735]: I0317 01:09:54.254455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:54 crc kubenswrapper[4735]: I0317 01:09:54.255291 4735 scope.go:117] "RemoveContainer" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 01:09:54 crc kubenswrapper[4735]: E0317 01:09:54.255574 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:09:55 crc kubenswrapper[4735]: I0317 01:09:55.003923 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:55Z is after 2026-02-23T05:33:13Z Mar 17 01:09:55 crc kubenswrapper[4735]: E0317 01:09:55.134064 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:09:56 crc kubenswrapper[4735]: I0317 01:09:56.003440 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:56Z is after 2026-02-23T05:33:13Z Mar 17 01:09:56 crc kubenswrapper[4735]: I0317 01:09:56.025800 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:56 crc kubenswrapper[4735]: I0317 01:09:56.027670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:56 crc kubenswrapper[4735]: I0317 01:09:56.027737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:56 crc kubenswrapper[4735]: I0317 01:09:56.027755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:56 crc kubenswrapper[4735]: I0317 01:09:56.027797 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:09:56 crc kubenswrapper[4735]: E0317 01:09:56.031444 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:56Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 17 01:09:56 crc kubenswrapper[4735]: E0317 01:09:56.035715 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 01:09:57 crc kubenswrapper[4735]: I0317 01:09:57.003439 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:57Z is after 2026-02-23T05:33:13Z Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.003398 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:58Z is after 2026-02-23T05:33:13Z Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.136982 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.137397 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.139188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.139259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.139284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.160057 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.263976 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.265217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.265276 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.265299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.325761 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 01:09:58 crc kubenswrapper[4735]: E0317 01:09:58.331629 4735 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.958459 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.958733 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.960488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.960557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.960582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:09:58 crc kubenswrapper[4735]: I0317 01:09:58.961645 4735 scope.go:117] "RemoveContainer" 
containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 01:09:58 crc kubenswrapper[4735]: E0317 01:09:58.962012 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:09:59 crc kubenswrapper[4735]: I0317 01:09:59.004820 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:59Z is after 2026-02-23T05:33:13Z Mar 17 01:09:59 crc kubenswrapper[4735]: W0317 01:09:59.059327 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:59Z is after 2026-02-23T05:33:13Z Mar 17 01:09:59 crc kubenswrapper[4735]: E0317 01:09:59.059439 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:09:59 crc kubenswrapper[4735]: E0317 01:09:59.619131 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:09:59Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d7ba49142f2c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,LastTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.003450 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:00Z is after 2026-02-23T05:33:13Z Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.158685 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.158773 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.356315 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.357138 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.359029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.359075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.359089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:00 crc kubenswrapper[4735]: I0317 01:10:00.359648 4735 scope.go:117] "RemoveContainer" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 01:10:00 crc kubenswrapper[4735]: E0317 01:10:00.359810 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:00 crc kubenswrapper[4735]: W0317 01:10:00.620426 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:00Z is after 
2026-02-23T05:33:13Z Mar 17 01:10:00 crc kubenswrapper[4735]: E0317 01:10:00.620520 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:10:01 crc kubenswrapper[4735]: I0317 01:10:01.002336 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:01Z is after 2026-02-23T05:33:13Z Mar 17 01:10:02 crc kubenswrapper[4735]: I0317 01:10:02.004051 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:02Z is after 2026-02-23T05:33:13Z Mar 17 01:10:02 crc kubenswrapper[4735]: W0317 01:10:02.332740 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:02Z is after 2026-02-23T05:33:13Z Mar 17 01:10:02 crc kubenswrapper[4735]: E0317 01:10:02.332852 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:10:03 crc kubenswrapper[4735]: I0317 01:10:03.002082 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:03Z is after 2026-02-23T05:33:13Z Mar 17 01:10:03 crc kubenswrapper[4735]: W0317 01:10:03.013947 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:03Z is after 2026-02-23T05:33:13Z Mar 17 01:10:03 crc kubenswrapper[4735]: E0317 01:10:03.014056 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 01:10:03 crc kubenswrapper[4735]: E0317 01:10:03.035257 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:03Z is after 
2026-02-23T05:33:13Z" interval="7s" Mar 17 01:10:03 crc kubenswrapper[4735]: I0317 01:10:03.036437 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:03 crc kubenswrapper[4735]: I0317 01:10:03.037922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:03 crc kubenswrapper[4735]: I0317 01:10:03.038148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:03 crc kubenswrapper[4735]: I0317 01:10:03.038207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:03 crc kubenswrapper[4735]: I0317 01:10:03.038285 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:03 crc kubenswrapper[4735]: E0317 01:10:03.041500 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 01:10:04 crc kubenswrapper[4735]: I0317 01:10:04.004056 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:04Z is after 2026-02-23T05:33:13Z Mar 17 01:10:05 crc kubenswrapper[4735]: I0317 01:10:05.003981 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:05Z is after 
2026-02-23T05:33:13Z
Mar 17 01:10:05 crc kubenswrapper[4735]: E0317 01:10:05.134272 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 01:10:06 crc kubenswrapper[4735]: I0317 01:10:06.004032 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 2026-02-23T05:33:13Z
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.004744 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.530312 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39282->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.530399 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39282->192.168.126.11:10357: read: connection reset by peer"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.530486 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.530680 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.532363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.532430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.532451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.533419 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 17 01:10:07 crc kubenswrapper[4735]: I0317 01:10:07.533724 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95" gracePeriod=30
Mar 17 01:10:08 crc kubenswrapper[4735]: I0317 01:10:08.005906 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 01:10:08 crc kubenswrapper[4735]: I0317 01:10:08.298960 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 17 01:10:08 crc kubenswrapper[4735]: I0317 01:10:08.299719 4735 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95" exitCode=255
Mar 17 01:10:08 crc kubenswrapper[4735]: I0317 01:10:08.299792 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95"}
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.006495 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.304159 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.304686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7"}
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.304788 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.305900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.305993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 01:10:09 crc kubenswrapper[4735]: I0317 01:10:09.306061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.624936 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba49142f2c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,LastTimestamp:2026-03-17 01:09:34.995780291 +0000 UTC m=+0.628013309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.629704 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.634138 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.638993 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.644688 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4990e13f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.126533105 +0000 UTC m=+0.758766123,LastTimestamp:2026-03-17 01:09:35.126533105 +0000 UTC m=+0.758766123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.651434 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.173916069 +0000 UTC m=+0.806149047,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.658412 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.173952538 +0000 UTC m=+0.806185506,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.665454 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950df290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.173960347 +0000 UTC m=+0.806193325,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.671973 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.174944214 +0000 UTC m=+0.807177192,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.681999 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.174970183 +0000 UTC m=+0.807203161,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.686495 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950df290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.174979033 +0000 UTC m=+0.807212011,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.691781 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.176516055 +0000 UTC m=+0.808749033,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.697164 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.176529514 +0000 UTC m=+0.808762492,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.704574 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.176561484 +0000 UTC m=+0.808794452,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.712947 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.176573923 +0000 UTC m=+0.808806891,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.719694 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950df290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.176581403 +0000 UTC m=+0.808814381,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.724655 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950df290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.176599903 +0000 UTC m=+0.808832891,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.731740 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.17714385 +0000 UTC m=+0.809376848,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.738597 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.177160059 +0000 UTC m=+0.809393047,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.746222 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950df290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.177172239 +0000 UTC m=+0.809405227,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.752908 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.17876207 +0000 UTC m=+0.810995088,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.760105 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.178785289 +0000 UTC m=+0.811018297,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.764705 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950df290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950df290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059415696 +0000 UTC m=+0.691648674,LastTimestamp:2026-03-17 01:09:35.178801209 +0000 UTC m=+0.811034227,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.771458 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950bb8da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950bb8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.05926985 +0000 UTC m=+0.691502828,LastTimestamp:2026-03-17 01:09:35.179193229 +0000 UTC m=+0.811426237,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.778134 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d7ba4950d0f2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d7ba4950d0f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.059357487 +0000 UTC m=+0.691590465,LastTimestamp:2026-03-17 01:09:35.179211299 +0000 UTC m=+0.811444317,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.791206 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba4b42fc196 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.581725078 +0000 UTC m=+1.213958066,LastTimestamp:2026-03-17 01:09:35.581725078 +0000 UTC m=+1.213958066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.795133 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d7ba4b456bc7b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.584279675 +0000 UTC m=+1.216512663,LastTimestamp:2026-03-17 01:09:35.584279675 +0000 UTC m=+1.216512663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.799201 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4b4ffc2aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.595356842 +0000 UTC m=+1.227589830,LastTimestamp:2026-03-17 01:09:35.595356842 +0000 UTC m=+1.227589830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.803306 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba4b508a650 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.595939408 +0000 UTC m=+1.228172386,LastTimestamp:2026-03-17 01:09:35.595939408 +0000 UTC m=+1.228172386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.806892 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba4b6a9197b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:35.623231867 +0000 UTC m=+1.255464855,LastTimestamp:2026-03-17 01:09:35.623231867 +0000 UTC m=+1.255464855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.811649 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba4da0ede7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.217103999 +0000 UTC m=+1.849336977,LastTimestamp:2026-03-17 01:09:36.217103999 +0000 UTC m=+1.849336977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.818954 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4da24f5fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.218551804 +0000 UTC m=+1.850784782,LastTimestamp:2026-03-17 01:09:36.218551804 +0000 UTC m=+1.850784782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.824988 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d7ba4da274290 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.21870248 +0000 UTC m=+1.850935468,LastTimestamp:2026-03-17 01:09:36.21870248 +0000 UTC m=+1.850935468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.830471 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba4da31d0f4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.219394292 +0000 UTC m=+1.851627270,LastTimestamp:2026-03-17 01:09:36.219394292 +0000 UTC m=+1.851627270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.837416 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba4da625758 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.222574424 +0000 UTC m=+1.854807402,LastTimestamp:2026-03-17 01:09:36.222574424 +0000 UTC m=+1.854807402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.843521 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4daede4a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.231720099 +0000 UTC m=+1.863953087,LastTimestamp:2026-03-17 01:09:36.231720099 +0000 UTC m=+1.863953087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.851305 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba4db072228 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.233374248 +0000 UTC m=+1.865607226,LastTimestamp:2026-03-17 01:09:36.233374248 +0000 UTC m=+1.865607226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.857615 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4db13d5d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.234206678 +0000 UTC m=+1.866439666,LastTimestamp:2026-03-17 01:09:36.234206678 +0000 UTC m=+1.866439666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc 
kubenswrapper[4735]: E0317 01:10:09.865740 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba4db194995 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.234563989 +0000 UTC m=+1.866796957,LastTimestamp:2026-03-17 01:09:36.234563989 +0000 UTC m=+1.866796957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.870253 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d7ba4db2528bb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.235342011 +0000 UTC m=+1.867574979,LastTimestamp:2026-03-17 01:09:36.235342011 +0000 UTC m=+1.867574979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.874248 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba4db3db0cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.236949711 +0000 UTC m=+1.869182689,LastTimestamp:2026-03-17 01:09:36.236949711 +0000 UTC m=+1.869182689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.882756 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4eab53782 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.496441218 +0000 UTC m=+2.128674186,LastTimestamp:2026-03-17 01:09:36.496441218 +0000 UTC 
m=+2.128674186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.888489 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4eb82f06c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.509923436 +0000 UTC m=+2.142156444,LastTimestamp:2026-03-17 01:09:36.509923436 +0000 UTC m=+2.142156444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.899241 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4eb9a480f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.511453199 +0000 UTC m=+2.143686217,LastTimestamp:2026-03-17 01:09:36.511453199 +0000 UTC m=+2.143686217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.901680 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4fcba20aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.798752938 +0000 UTC m=+2.430985946,LastTimestamp:2026-03-17 01:09:36.798752938 +0000 UTC m=+2.430985946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.906162 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4ff70fecc openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.844291788 +0000 UTC m=+2.476524806,LastTimestamp:2026-03-17 01:09:36.844291788 +0000 UTC m=+2.476524806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.913078 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4fff6b888 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.853055624 +0000 UTC m=+2.485288632,LastTimestamp:2026-03-17 01:09:36.853055624 +0000 UTC m=+2.485288632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 
01:10:09.917522 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba50ecfa3c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.102152641 +0000 UTC m=+2.734385629,LastTimestamp:2026-03-17 01:09:37.102152641 +0000 UTC m=+2.734385629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.922755 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba50f1837fa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.106909178 
+0000 UTC m=+2.739142196,LastTimestamp:2026-03-17 01:09:37.106909178 +0000 UTC m=+2.739142196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.929418 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d7ba50f68f01e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.112199198 +0000 UTC m=+2.744432226,LastTimestamp:2026-03-17 01:09:37.112199198 +0000 UTC m=+2.744432226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.934425 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba50fd060cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.118978255 +0000 UTC m=+2.751211263,LastTimestamp:2026-03-17 01:09:37.118978255 +0000 UTC m=+2.751211263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.938785 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba50fdbbdf1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.119722993 +0000 UTC m=+2.751955981,LastTimestamp:2026-03-17 01:09:37.119722993 +0000 UTC m=+2.751955981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.945077 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba510507b7b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.127373691 +0000 UTC m=+2.759606709,LastTimestamp:2026-03-17 01:09:37.127373691 +0000 UTC m=+2.759606709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.951038 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba5207c8fbb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.398697915 +0000 UTC m=+3.030930883,LastTimestamp:2026-03-17 01:09:37.398697915 +0000 UTC m=+3.030930883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.954947 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba52113726d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.408586349 +0000 UTC m=+3.040819327,LastTimestamp:2026-03-17 01:09:37.408586349 +0000 UTC m=+3.040819327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.960917 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba5214249d0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.411656144 +0000 UTC 
m=+3.043889112,LastTimestamp:2026-03-17 01:09:37.411656144 +0000 UTC m=+3.043889112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.965028 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba521b93311 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.419449105 +0000 UTC m=+3.051682103,LastTimestamp:2026-03-17 01:09:37.419449105 +0000 UTC m=+3.051682103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.971067 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d7ba522bdb499 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.436521625 +0000 UTC m=+3.068754603,LastTimestamp:2026-03-17 01:09:37.436521625 +0000 UTC m=+3.068754603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.975558 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba522c26d4d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.436831053 +0000 UTC m=+3.069064031,LastTimestamp:2026-03-17 01:09:37.436831053 +0000 UTC m=+3.069064031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.979811 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba52328bcaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.443536047 +0000 UTC m=+3.075769025,LastTimestamp:2026-03-17 01:09:37.443536047 +0000 UTC m=+3.075769025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.984106 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d7ba523fc336a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.457394538 +0000 UTC m=+3.089627516,LastTimestamp:2026-03-17 01:09:37.457394538 +0000 UTC m=+3.089627516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.989714 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba52477cb24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.465494308 +0000 UTC m=+3.097727286,LastTimestamp:2026-03-17 01:09:37.465494308 +0000 UTC m=+3.097727286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:09 crc kubenswrapper[4735]: E0317 01:10:09.996201 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba524a3ef61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.468387169 +0000 UTC m=+3.100620147,LastTimestamp:2026-03-17 01:09:37.468387169 +0000 UTC m=+3.100620147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.000281 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba52e36f1be openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.62901651 +0000 UTC m=+3.261249488,LastTimestamp:2026-03-17 01:09:37.62901651 +0000 UTC m=+3.261249488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.005184 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba52f020d15 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.642327317 +0000 UTC m=+3.274560295,LastTimestamp:2026-03-17 01:09:37.642327317 +0000 UTC m=+3.274560295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.005949 4735 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.011598 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba52f137716 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.643468566 +0000 UTC m=+3.275701544,LastTimestamp:2026-03-17 01:09:37.643468566 +0000 UTC m=+3.275701544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.016729 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba532651f3b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.699151675 +0000 UTC m=+3.331384643,LastTimestamp:2026-03-17 01:09:37.699151675 +0000 UTC m=+3.331384643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.021053 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba5334a82c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.714184905 +0000 UTC m=+3.346417923,LastTimestamp:2026-03-17 01:09:37.714184905 +0000 UTC m=+3.346417923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.025797 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba5335e5ca1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.715485857 +0000 UTC m=+3.347718835,LastTimestamp:2026-03-17 01:09:37.715485857 +0000 UTC m=+3.347718835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.030567 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba53b93fcc1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.853217985 +0000 UTC m=+3.485450973,LastTimestamp:2026-03-17 01:09:37.853217985 +0000 UTC m=+3.485450973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 
01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.035042 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d7ba53ce1897f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.875077503 +0000 UTC m=+3.507310491,LastTimestamp:2026-03-17 01:09:37.875077503 +0000 UTC m=+3.507310491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.040420 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.041328 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba53fd10774 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.924327284 +0000 UTC m=+3.556560272,LastTimestamp:2026-03-17 01:09:37.924327284 +0000 UTC m=+3.556560272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.041601 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.042484 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba540aa51b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.938567604 +0000 UTC m=+3.570800582,LastTimestamp:2026-03-17 01:09:37.938567604 +0000 UTC m=+3.570800582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.043479 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.043511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.043525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.043555 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.049402 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.049665 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba540bbac1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:37.939704863 +0000 UTC m=+3.571937881,LastTimestamp:2026-03-17 01:09:37.939704863 +0000 UTC m=+3.571937881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.058591 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba54d511b0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.150824716 +0000 UTC m=+3.783057724,LastTimestamp:2026-03-17 01:09:38.150824716 +0000 UTC m=+3.783057724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.064679 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba54db42485 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
01:09:38.157315205 +0000 UTC m=+3.789548223,LastTimestamp:2026-03-17 01:09:38.157315205 +0000 UTC m=+3.789548223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.071185 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba54f6cb70b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.186188555 +0000 UTC m=+3.818421533,LastTimestamp:2026-03-17 01:09:38.186188555 +0000 UTC m=+3.818421533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.075424 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba54f8e4c21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.188389409 +0000 UTC m=+3.820622387,LastTimestamp:2026-03-17 01:09:38.188389409 +0000 UTC m=+3.820622387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.080376 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba55c5bb807 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.403178503 +0000 UTC m=+4.035411481,LastTimestamp:2026-03-17 01:09:38.403178503 +0000 UTC m=+4.035411481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.085812 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba55cef06b8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.41283244 +0000 UTC m=+4.045065418,LastTimestamp:2026-03-17 01:09:38.41283244 +0000 UTC m=+4.045065418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.092310 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba55d23d145 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.416292165 +0000 UTC m=+4.048525143,LastTimestamp:2026-03-17 01:09:38.416292165 +0000 UTC m=+4.048525143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.098981 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba55d7aa3fd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.421982205 +0000 UTC m=+4.054215183,LastTimestamp:2026-03-17 01:09:38.421982205 +0000 UTC m=+4.054215183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.107420 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba589e1d9f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.166943729 +0000 UTC m=+4.799176747,LastTimestamp:2026-03-17 01:09:39.166943729 +0000 UTC m=+4.799176747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.117757 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba598d9acbb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.418066107 +0000 UTC m=+5.050299095,LastTimestamp:2026-03-17 01:09:39.418066107 +0000 UTC m=+5.050299095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.123903 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba599a361dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.431285212 +0000 UTC m=+5.063518210,LastTimestamp:2026-03-17 01:09:39.431285212 +0000 UTC m=+5.063518210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.132053 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba599b30141 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.432309057 +0000 UTC m=+5.064542065,LastTimestamp:2026-03-17 01:09:39.432309057 +0000 UTC m=+5.064542065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.139187 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5aae4c4ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.720783103 +0000 UTC m=+5.353016121,LastTimestamp:2026-03-17 01:09:39.720783103 +0000 UTC m=+5.353016121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.146997 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189d7ba5abec88ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.738069198 +0000 UTC m=+5.370302206,LastTimestamp:2026-03-17 01:09:39.738069198 +0000 UTC m=+5.370302206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.153142 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5ac1820f6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:39.740926198 +0000 UTC m=+5.373159216,LastTimestamp:2026-03-17 01:09:39.740926198 +0000 UTC m=+5.373159216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.157915 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5bbcd2549 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.004447561 +0000 UTC m=+5.636680569,LastTimestamp:2026-03-17 01:09:40.004447561 +0000 UTC m=+5.636680569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.164816 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5bccb4db2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.02110405 +0000 UTC m=+5.653337068,LastTimestamp:2026-03-17 01:09:40.02110405 +0000 UTC m=+5.653337068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.169935 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189d7ba5bce33223 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.022669859 +0000 UTC m=+5.654902867,LastTimestamp:2026-03-17 01:09:40.022669859 +0000 UTC m=+5.654902867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.176674 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5cf35771f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.330051359 +0000 UTC m=+5.962284377,LastTimestamp:2026-03-17 01:09:40.330051359 +0000 UTC m=+5.962284377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.182646 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5d02734a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.345894049 +0000 UTC m=+5.978127067,LastTimestamp:2026-03-17 01:09:40.345894049 +0000 UTC m=+5.978127067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.188957 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5d040f2f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.347581171 +0000 UTC m=+5.979814179,LastTimestamp:2026-03-17 01:09:40.347581171 +0000 UTC m=+5.979814179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.198115 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5df8d8454 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.604257364 +0000 UTC m=+6.236490372,LastTimestamp:2026-03-17 01:09:40.604257364 +0000 UTC m=+6.236490372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.204797 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d7ba5e0bb3aa3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:40.624030371 +0000 UTC m=+6.256263339,LastTimestamp:2026-03-17 01:09:40.624030371 +0000 UTC m=+6.256263339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.213477 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 17 01:10:10 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-apiserver-crc.189d7ba7f8b56153 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 17 01:10:10 crc kubenswrapper[4735]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 01:10:10 crc kubenswrapper[4735]: Mar 17 01:10:10 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:49.616234835 +0000 UTC m=+15.248467833,LastTimestamp:2026-03-17 01:09:49.616234835 +0000 UTC m=+15.248467833,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:10 crc kubenswrapper[4735]: > Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.220463 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba7f8b6bcb8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
01:09:49.616323768 +0000 UTC m=+15.248556766,LastTimestamp:2026-03-17 01:09:49.616323768 +0000 UTC m=+15.248556766,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.226891 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d7ba7f8b56153\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 01:10:10 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-apiserver-crc.189d7ba7f8b56153 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 17 01:10:10 crc kubenswrapper[4735]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 01:10:10 crc kubenswrapper[4735]: Mar 17 01:10:10 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:49.616234835 +0000 UTC m=+15.248467833,LastTimestamp:2026-03-17 01:09:49.621110456 +0000 UTC m=+15.253343444,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:10 crc kubenswrapper[4735]: > Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.231219 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d7ba7f8b6bcb8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba7f8b6bcb8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:49.616323768 +0000 UTC m=+15.248556766,LastTimestamp:2026-03-17 01:09:49.621153457 +0000 UTC m=+15.253386445,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.236489 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 01:10:10 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-apiserver-crc.189d7ba7f98cb8e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:55968->192.168.126.11:17697: read: connection reset by peer Mar 17 01:10:10 crc kubenswrapper[4735]: body: Mar 17 01:10:10 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:49.630347493 +0000 UTC m=+15.262580481,LastTimestamp:2026-03-17 01:09:49.630347493 +0000 UTC 
m=+15.262580481,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:10 crc kubenswrapper[4735]: > Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.241357 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba7f98dbd2d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55968->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:49.630414125 +0000 UTC m=+15.262647113,LastTimestamp:2026-03-17 01:09:49.630414125 +0000 UTC m=+15.262647113,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.248565 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 01:10:10 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7ba819155dfa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 01:10:10 crc kubenswrapper[4735]: body: Mar 17 01:10:10 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159396346 +0000 UTC m=+15.791629354,LastTimestamp:2026-03-17 01:09:50.159396346 +0000 UTC m=+15.791629354,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:10 crc kubenswrapper[4735]: > Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.254463 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba81916594f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159460687 +0000 UTC m=+15.791693705,LastTimestamp:2026-03-17 01:09:50.159460687 +0000 UTC 
m=+15.791693705,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.262559 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d7ba54f8e4c21\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7ba54f8e4c21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:38.188389409 +0000 UTC m=+3.820622387,LastTimestamp:2026-03-17 01:09:50.224774664 +0000 UTC m=+15.857007672,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.271746 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba819155dfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 01:10:10 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7ba819155dfa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 01:10:10 crc kubenswrapper[4735]: body: Mar 17 01:10:10 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159396346 +0000 UTC m=+15.791629354,LastTimestamp:2026-03-17 01:10:00.158753103 +0000 UTC m=+25.790986091,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:10 crc kubenswrapper[4735]: > Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.279640 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba81916594f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba81916594f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159460687 +0000 UTC m=+15.791693705,LastTimestamp:2026-03-17 01:10:00.158815305 
+0000 UTC m=+25.791048293,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.288451 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 01:10:10 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7bac24797c52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:39282->192.168.126.11:10357: read: connection reset by peer Mar 17 01:10:10 crc kubenswrapper[4735]: body: Mar 17 01:10:10 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:10:07.530376274 +0000 UTC m=+33.162609282,LastTimestamp:2026-03-17 01:10:07.530376274 +0000 UTC m=+33.162609282,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:10 crc kubenswrapper[4735]: > Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.295945 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7bac247a8392 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39282->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:10:07.530443666 +0000 UTC m=+33.162676674,LastTimestamp:2026-03-17 01:10:07.530443666 +0000 UTC m=+33.162676674,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.302579 4735 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7bac24ac1425 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:10:07.533691941 +0000 UTC m=+33.165924949,LastTimestamp:2026-03-17 01:10:07.533691941 +0000 UTC m=+33.165924949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.307141 4735 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.308418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.308469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:10 crc kubenswrapper[4735]: I0317 01:10:10.308483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.310932 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba4db13d5d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4db13d5d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.234206678 +0000 UTC m=+1.866439666,LastTimestamp:2026-03-17 01:10:08.056689428 +0000 UTC m=+33.688922416,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.318033 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba4eab53782\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4eab53782 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.496441218 +0000 UTC m=+2.128674186,LastTimestamp:2026-03-17 01:10:08.3012761 +0000 UTC m=+33.933509078,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:10 crc kubenswrapper[4735]: E0317 01:10:10.320584 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba4eb82f06c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba4eb82f06c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:36.509923436 +0000 UTC m=+2.142156444,LastTimestamp:2026-03-17 01:10:08.312714947 +0000 UTC m=+33.944947925,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:11 crc kubenswrapper[4735]: I0317 01:10:11.006344 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:11 crc kubenswrapper[4735]: I0317 01:10:11.267798 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:10:11 crc kubenswrapper[4735]: I0317 01:10:11.311012 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:11 crc kubenswrapper[4735]: I0317 01:10:11.312500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:11 crc kubenswrapper[4735]: I0317 01:10:11.312559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:11 crc kubenswrapper[4735]: I0317 01:10:11.312577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:12 crc kubenswrapper[4735]: I0317 01:10:12.007459 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:12 crc kubenswrapper[4735]: I0317 01:10:12.072941 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:12 crc kubenswrapper[4735]: I0317 01:10:12.075128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:12 crc kubenswrapper[4735]: I0317 01:10:12.075353 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:12 crc kubenswrapper[4735]: I0317 01:10:12.075489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:12 crc kubenswrapper[4735]: I0317 01:10:12.076950 4735 scope.go:117] "RemoveContainer" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.007157 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.321559 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.322777 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.326057 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7" exitCode=255 Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.326298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7"} Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.326477 4735 scope.go:117] "RemoveContainer" containerID="9a9fe93d66053e157ce685a20d9c0d01f037ba4e50c2d91f464e9e7871a131f5" Mar 17 
01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.326900 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.328954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.329496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.329816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:13 crc kubenswrapper[4735]: I0317 01:10:13.331238 4735 scope.go:117] "RemoveContainer" containerID="6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7" Mar 17 01:10:13 crc kubenswrapper[4735]: E0317 01:10:13.332504 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:14 crc kubenswrapper[4735]: I0317 01:10:14.006494 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:14 crc kubenswrapper[4735]: I0317 01:10:14.332046 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 01:10:14 crc kubenswrapper[4735]: I0317 01:10:14.766257 4735 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 01:10:14 crc kubenswrapper[4735]: I0317 01:10:14.784842 4735 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 17 01:10:15 crc kubenswrapper[4735]: I0317 01:10:15.007125 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:15 crc kubenswrapper[4735]: E0317 01:10:15.134968 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:10:16 crc kubenswrapper[4735]: I0317 01:10:16.005695 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:16 crc kubenswrapper[4735]: W0317 01:10:16.811192 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:16 crc kubenswrapper[4735]: E0317 01:10:16.811267 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.005731 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:17 crc kubenswrapper[4735]: E0317 01:10:17.049405 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.049915 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.051756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.051820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.051840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.051907 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:17 crc kubenswrapper[4735]: E0317 01:10:17.059380 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.159131 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.159395 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.161482 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.161592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:17 crc kubenswrapper[4735]: I0317 01:10:17.161616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.005637 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:18 crc kubenswrapper[4735]: W0317 01:10:18.119095 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 17 01:10:18 crc kubenswrapper[4735]: E0317 01:10:18.119185 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.958463 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.958763 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.960783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.960852 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.960897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:18 crc kubenswrapper[4735]: I0317 01:10:18.961768 4735 scope.go:117] "RemoveContainer" containerID="6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7" Mar 17 01:10:18 crc kubenswrapper[4735]: E0317 01:10:18.962110 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:19 crc kubenswrapper[4735]: I0317 01:10:19.005418 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.006422 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.160016 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.160110 4735 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:10:20 crc kubenswrapper[4735]: E0317 01:10:20.167983 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba819155dfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 01:10:20 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7ba819155dfa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 01:10:20 crc kubenswrapper[4735]: body: Mar 17 01:10:20 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159396346 +0000 UTC m=+15.791629354,LastTimestamp:2026-03-17 01:10:20.160084299 +0000 UTC m=+45.792317307,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:20 crc kubenswrapper[4735]: > Mar 17 01:10:20 crc kubenswrapper[4735]: E0317 01:10:20.175798 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba81916594f\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7ba81916594f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159460687 +0000 UTC m=+15.791693705,LastTimestamp:2026-03-17 01:10:20.160154971 +0000 UTC m=+45.792387999,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.355632 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.356023 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.357911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.358003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.358024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:20 crc kubenswrapper[4735]: I0317 01:10:20.359008 4735 scope.go:117] "RemoveContainer" 
containerID="6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7" Mar 17 01:10:20 crc kubenswrapper[4735]: E0317 01:10:20.359319 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:21 crc kubenswrapper[4735]: I0317 01:10:21.006053 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:22 crc kubenswrapper[4735]: I0317 01:10:22.005245 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:22 crc kubenswrapper[4735]: W0317 01:10:22.540005 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 17 01:10:22 crc kubenswrapper[4735]: E0317 01:10:22.540086 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 17 01:10:23 crc kubenswrapper[4735]: I0317 01:10:23.004124 4735 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:23 crc kubenswrapper[4735]: I0317 01:10:23.398341 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 01:10:23 crc kubenswrapper[4735]: I0317 01:10:23.399197 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:23 crc kubenswrapper[4735]: I0317 01:10:23.401115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:23 crc kubenswrapper[4735]: I0317 01:10:23.401189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:23 crc kubenswrapper[4735]: I0317 01:10:23.401208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:24 crc kubenswrapper[4735]: I0317 01:10:24.006176 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:24 crc kubenswrapper[4735]: I0317 01:10:24.060557 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:24 crc kubenswrapper[4735]: E0317 01:10:24.060979 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 01:10:24 crc kubenswrapper[4735]: I0317 01:10:24.066125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 17 01:10:24 crc kubenswrapper[4735]: I0317 01:10:24.066191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:24 crc kubenswrapper[4735]: I0317 01:10:24.066296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:24 crc kubenswrapper[4735]: I0317 01:10:24.066442 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:24 crc kubenswrapper[4735]: E0317 01:10:24.076975 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 01:10:25 crc kubenswrapper[4735]: I0317 01:10:25.006692 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:25 crc kubenswrapper[4735]: E0317 01:10:25.135814 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:10:25 crc kubenswrapper[4735]: W0317 01:10:25.662854 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 17 01:10:25 crc kubenswrapper[4735]: E0317 01:10:25.662970 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 01:10:26 crc kubenswrapper[4735]: I0317 01:10:26.006295 4735 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:27 crc kubenswrapper[4735]: I0317 01:10:27.005293 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:28 crc kubenswrapper[4735]: I0317 01:10:28.007932 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:29 crc kubenswrapper[4735]: I0317 01:10:29.004401 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:30 crc kubenswrapper[4735]: I0317 01:10:30.003971 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:30 crc kubenswrapper[4735]: I0317 01:10:30.159066 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:10:30 crc kubenswrapper[4735]: I0317 01:10:30.159218 4735 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:10:30 crc kubenswrapper[4735]: E0317 01:10:30.168922 4735 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7ba819155dfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 01:10:30 crc kubenswrapper[4735]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7ba819155dfa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 01:10:30 crc kubenswrapper[4735]: body: Mar 17 01:10:30 crc kubenswrapper[4735]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:09:50.159396346 +0000 UTC m=+15.791629354,LastTimestamp:2026-03-17 01:10:30.159183979 +0000 UTC m=+55.791416997,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 01:10:30 crc kubenswrapper[4735]: > Mar 17 01:10:31 crc kubenswrapper[4735]: I0317 01:10:31.006830 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:31 crc kubenswrapper[4735]: E0317 01:10:31.070348 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 01:10:31 crc kubenswrapper[4735]: I0317 01:10:31.077831 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:31 crc kubenswrapper[4735]: I0317 01:10:31.079739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:31 crc kubenswrapper[4735]: I0317 01:10:31.080000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:31 crc kubenswrapper[4735]: I0317 01:10:31.080154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:31 crc kubenswrapper[4735]: I0317 01:10:31.080316 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:31 crc kubenswrapper[4735]: E0317 01:10:31.088046 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 01:10:32 crc kubenswrapper[4735]: I0317 01:10:32.012215 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:32.999947 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.073005 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.074790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.074837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.074860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.075994 4735 scope.go:117] "RemoveContainer" containerID="6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.393069 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.395668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59"} Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.395858 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.397180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.397204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 01:10:33 crc kubenswrapper[4735]: I0317 01:10:33.397215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.007184 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.401431 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.403269 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.406959 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" exitCode=255 Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.407082 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59"} Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.407506 4735 scope.go:117] "RemoveContainer" containerID="6c071796995e688ed656d606ae035b382de73cf5d365b74b1bff9335a459a3b7" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.407693 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.409307 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.409575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.412046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:34 crc kubenswrapper[4735]: I0317 01:10:34.413135 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:10:34 crc kubenswrapper[4735]: E0317 01:10:34.413688 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:35 crc kubenswrapper[4735]: I0317 01:10:35.009692 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:35 crc kubenswrapper[4735]: E0317 01:10:35.136456 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:10:35 crc kubenswrapper[4735]: I0317 01:10:35.413097 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 01:10:36 crc kubenswrapper[4735]: I0317 01:10:36.003311 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.006090 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.162701 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.162913 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.164113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.164180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.164198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.169077 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.422181 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.435827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.436008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 01:10:37 crc kubenswrapper[4735]: I0317 01:10:37.436024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.006649 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:38 crc kubenswrapper[4735]: E0317 01:10:38.078404 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.088216 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.090397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.090471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.090490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.090533 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:38 crc kubenswrapper[4735]: E0317 01:10:38.098642 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.958237 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.958476 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.960178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.960226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.960246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:38 crc kubenswrapper[4735]: I0317 01:10:38.961067 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:10:38 crc kubenswrapper[4735]: E0317 01:10:38.961379 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:39 crc kubenswrapper[4735]: I0317 01:10:39.005832 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:39 crc kubenswrapper[4735]: I0317 01:10:39.639940 4735 csr.go:261] certificate signing request csr-zjll6 is approved, waiting to be issued Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.001471 4735 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.355542 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.355719 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.356785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.356812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.356824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.357524 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:10:40 crc kubenswrapper[4735]: E0317 01:10:40.357745 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.621111 4735 csr.go:257] certificate signing request csr-zjll6 is issued Mar 17 01:10:40 crc kubenswrapper[4735]: I0317 01:10:40.684260 4735 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 17 01:10:40 crc 
kubenswrapper[4735]: I0317 01:10:40.868947 4735 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 17 01:10:41 crc kubenswrapper[4735]: I0317 01:10:41.622409 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 08:18:26.895018571 +0000 UTC Mar 17 01:10:41 crc kubenswrapper[4735]: I0317 01:10:41.622462 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6679h7m45.27256018s for next certificate rotation Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.099436 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.100680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.100713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.100722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.100828 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.116659 4735 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.117019 4735 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.117057 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.120884 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.120945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.120954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.120973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.120988 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:45Z","lastTransitionTime":"2026-03-17T01:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.137503 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.138782 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca
29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.147669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.147698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.147709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.147728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.147739 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:45Z","lastTransitionTime":"2026-03-17T01:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.157004 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.164338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.164409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.164425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.164450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.164462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:45Z","lastTransitionTime":"2026-03-17T01:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.178459 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.187016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.187069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.187085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.187115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:45 crc kubenswrapper[4735]: I0317 01:10:45.187129 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:45Z","lastTransitionTime":"2026-03-17T01:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.197565 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.197728 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.197767 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.298648 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.399586 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.500654 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.601273 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.702203 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.822070 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:45 crc kubenswrapper[4735]: E0317 01:10:45.922295 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.023244 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.124377 4735 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.225042 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.325188 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.426124 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.526611 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.627517 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.728311 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.829145 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:46 crc kubenswrapper[4735]: E0317 01:10:46.929595 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.030150 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.131229 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.232078 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc 
kubenswrapper[4735]: E0317 01:10:47.332319 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.433468 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.534515 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.635632 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.736668 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.837651 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:47 crc kubenswrapper[4735]: E0317 01:10:47.937789 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.038999 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.139694 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.240710 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.341585 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.441715 4735 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.543014 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.643319 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.743951 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.844290 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:48 crc kubenswrapper[4735]: E0317 01:10:48.944821 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.045084 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.145967 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.246445 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.347619 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.448163 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.549241 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.650111 4735 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.751081 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.851598 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:49 crc kubenswrapper[4735]: E0317 01:10:49.951985 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.052826 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.153529 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.254355 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.355374 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.455536 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.556161 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.657184 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.757932 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 
01:10:50.858815 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:50 crc kubenswrapper[4735]: E0317 01:10:50.959672 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.060366 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.160592 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.261782 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.362486 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.462772 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.562989 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.664072 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.764422 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.864808 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:51 crc kubenswrapper[4735]: E0317 01:10:51.964978 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 
01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.066011 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.167257 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.267548 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.368714 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.469611 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.570178 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.670411 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.770646 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.871796 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:52 crc kubenswrapper[4735]: I0317 01:10:52.875022 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 17 01:10:52 crc kubenswrapper[4735]: E0317 01:10:52.973106 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.074040 4735 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.174271 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.274692 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.375761 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.476509 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.576994 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.678071 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.778701 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.879941 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:53 crc kubenswrapper[4735]: E0317 01:10:53.980510 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.072159 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.072222 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.074039 4735 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.074105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.074133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.074102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.074207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.074235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:54 crc kubenswrapper[4735]: I0317 01:10:54.075415 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.075697 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.080604 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.181334 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.282078 4735 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.383301 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.484318 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.585039 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.685945 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.786228 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.886899 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:54 crc kubenswrapper[4735]: E0317 01:10:54.987445 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.087681 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.138429 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.188717 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.289483 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.390048 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.431186 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.437346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.437402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.437421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.437449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.437469 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:55Z","lastTransitionTime":"2026-03-17T01:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.455097 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.461494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.461721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.461888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.462057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.462201 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:55Z","lastTransitionTime":"2026-03-17T01:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.480794 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.485987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.486038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.486056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.486090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.486110 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:55Z","lastTransitionTime":"2026-03-17T01:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.503598 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.515402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.515475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.515497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.515524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:10:55 crc kubenswrapper[4735]: I0317 01:10:55.515543 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:10:55Z","lastTransitionTime":"2026-03-17T01:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.532674 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.533044 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.533093 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.633451 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.734374 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.835236 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:55 crc kubenswrapper[4735]: E0317 01:10:55.935605 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.036775 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.137837 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 
01:10:56.238544 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.338965 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.439467 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.540478 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.641617 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.742189 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.843395 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:56 crc kubenswrapper[4735]: E0317 01:10:56.944266 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.044569 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.144905 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.245804 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.346773 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 
01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.447568 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.548728 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.648927 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.750158 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.851204 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:57 crc kubenswrapper[4735]: E0317 01:10:57.951538 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.051927 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.152818 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.254118 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.355810 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.456982 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.558035 4735 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.658851 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.759803 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: I0317 01:10:58.845369 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.860698 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:58 crc kubenswrapper[4735]: E0317 01:10:58.961257 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.062155 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: I0317 01:10:59.072637 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:10:59 crc kubenswrapper[4735]: I0317 01:10:59.074186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:10:59 crc kubenswrapper[4735]: I0317 01:10:59.074237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:10:59 crc kubenswrapper[4735]: I0317 01:10:59.074253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.163273 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.263479 4735 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.363585 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.463942 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.564749 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.665844 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.766943 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.867623 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:10:59 crc kubenswrapper[4735]: E0317 01:10:59.967917 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.068998 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.169907 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.270174 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.371168 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc 
kubenswrapper[4735]: E0317 01:11:00.472018 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.573094 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.674099 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.775172 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.876498 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:00 crc kubenswrapper[4735]: E0317 01:11:00.977279 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.077612 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.178945 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.279372 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.381106 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.481565 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.582688 4735 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.683923 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.784086 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.884243 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:01 crc kubenswrapper[4735]: E0317 01:11:01.984817 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.085282 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.186148 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.286973 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.387372 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: I0317 01:11:02.407934 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.488491 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.589572 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.690784 4735 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.791891 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.893213 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:02 crc kubenswrapper[4735]: E0317 01:11:02.994420 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.095077 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.195979 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.296453 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.397138 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.497713 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.598218 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.698803 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 01:11:03.799722 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:03 crc kubenswrapper[4735]: E0317 
01:11:03.900679 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.001487 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.102650 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.203635 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.304374 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.405408 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.505511 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.606408 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.708040 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.809244 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:04 crc kubenswrapper[4735]: E0317 01:11:04.909916 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.010083 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 
01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.072223 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.073628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.073781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.073909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.074723 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.075089 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.111053 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.139365 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.211384 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.311734 4735 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.412167 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.513338 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.613711 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.705698 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.711943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.712010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.712031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.712062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.712080 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:05Z","lastTransitionTime":"2026-03-17T01:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.728590 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.736282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.736372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.736398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.736430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.736464 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:05Z","lastTransitionTime":"2026-03-17T01:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.752461 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.757514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.757576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.757600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.757631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.757655 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:05Z","lastTransitionTime":"2026-03-17T01:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.772920 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.778419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.778480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.778503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.778539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:05 crc kubenswrapper[4735]: I0317 01:11:05.778563 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:05Z","lastTransitionTime":"2026-03-17T01:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.794044 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.794790 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.795039 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.895656 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:05 crc kubenswrapper[4735]: E0317 01:11:05.997039 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.097884 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.199005 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.299932 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.400074 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.500625 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.601609 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.702849 4735 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.803584 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:06 crc kubenswrapper[4735]: E0317 01:11:06.904468 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:07 crc kubenswrapper[4735]: E0317 01:11:07.005584 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:07 crc kubenswrapper[4735]: E0317 01:11:07.106329 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.135898 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.209227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.209575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.209800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.210076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.210327 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.313692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.314109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.314279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.314449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.314618 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.419147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.420030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.420073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.420105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.420129 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.523111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.523175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.523197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.523225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.523284 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.626593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.626674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.626699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.626725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.626748 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.730407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.731017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.731211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.731390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.731526 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.834451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.834522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.834541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.834569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.834585 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.937617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.937681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.937697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.937722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:07 crc kubenswrapper[4735]: I0317 01:11:07.937739 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:07Z","lastTransitionTime":"2026-03-17T01:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.039968 4735 apiserver.go:52] "Watching apiserver" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.041268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.041324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.041341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.041364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.041381 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.045384 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.046031 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.046963 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.047480 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.047579 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.047697 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.047826 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.047965 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.048158 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.048064 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.048668 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.053387 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.053616 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.053785 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.053846 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.053977 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.054406 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.054433 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.055104 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.055362 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.083356 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.104041 4735 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.105710 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.121666 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.132655 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.144481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.144520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.144532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.144549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.144564 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.149289 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.162838 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.173970 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.174204 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.174305 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.174425 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.174516 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.175132 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.175435 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.176007 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.176439 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.177138 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.177814 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.177928 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 
01:11:08.178403 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.178516 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.178900 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179183 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.176985 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179295 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179357 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179421 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179575 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179610 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179669 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179697 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179737 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179790 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179812 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179883 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179934 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180035 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180062 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180088 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180143 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180173 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180233 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180289 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180315 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 
01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180344 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180373 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180451 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180476 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180500 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180526 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180578 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180607 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 01:11:08 crc 
kubenswrapper[4735]: I0317 01:11:08.180632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180661 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180710 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180756 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180809 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180897 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180924 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.181007 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.181032 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.181057 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.177063 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.177101 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.177751 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.178355 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.178448 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.178848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.179220 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180067 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.180556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.181068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.181565 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:11:08.681531743 +0000 UTC m=+94.313764931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.181775 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.181091 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182187 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182233 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182289 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182327 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182435 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182471 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182505 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182538 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182572 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182607 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182639 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") 
" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182671 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182703 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182769 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182837 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182896 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182965 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.182998 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183033 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 17 
01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183065 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183101 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183126 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183137 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183210 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183233 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183286 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183307 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183326 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183344 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183434 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183473 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183775 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184042 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184114 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184191 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184210 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184207 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184229 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184392 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.183958 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184567 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184808 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184916 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185115 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185350 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185554 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185642 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185735 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185826 
4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186015 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186110 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186213 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186308 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186436 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186555 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186648 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186740 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186842 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186950 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187053 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187247 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187340 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187432 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187527 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187653 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187754 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187875 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187979 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188084 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188524 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188624 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188755 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189047 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189161 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189253 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189345 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189452 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189546 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187116 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191124 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191242 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191344 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.184820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185133 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185368 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185629 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185734 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191549 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.185847 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186263 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186563 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186567 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186652 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186890 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187065 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187191 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187417 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187642 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187699 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187800 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.186099 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.187988 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188621 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188763 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.188340 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189337 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189656 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.189916 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.190814 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.191932 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192101 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192130 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192217 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192652 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193175 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193296 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193488 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193781 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194082 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194178 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194260 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 01:11:08 crc kubenswrapper[4735]: 
I0317 01:11:08.194453 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194535 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194608 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194690 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194876 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195130 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195268 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195355 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195440 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195819 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196050 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196146 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196303 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196405 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196502 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196617 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196891 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196997 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:08 
crc kubenswrapper[4735]: I0317 01:11:08.197356 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197590 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197667 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198086 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198202 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198553 4735 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198692 4735 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198780 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198922 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199027 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199114 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199187 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199260 4735 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199334 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199410 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" 
DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199561 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199658 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199770 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199809 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199828 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199848 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199968 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199994 4735 reconciler_common.go:293] 
"Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200014 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200032 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200047 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200063 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200078 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200093 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200109 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200126 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200142 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200158 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200173 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200186 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200200 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200214 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 
crc kubenswrapper[4735]: I0317 01:11:08.200228 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200257 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200272 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200287 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200301 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200316 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200329 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200343 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200357 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200370 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200386 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200401 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200415 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200431 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.202895 4735 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.192990 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193192 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.207552 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.207609 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.207751 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.207805 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.207823 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193351 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193393 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193649 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194253 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194489 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.193984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.194947 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195142 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195467 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195411 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.195544 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196046 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196113 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196173 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196453 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196480 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196615 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196881 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196886 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.196974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197073 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197342 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197720 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197745 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197919 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.197997 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198424 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198816 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198847 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.198963 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199230 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199401 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199638 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199506 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199700 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.199732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200021 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200144 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200246 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200815 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.200983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.201220 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.201266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.201349 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.201551 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.202109 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.202211 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.202560 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.203997 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.204028 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.204061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.205053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.205277 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.205533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.205554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.206234 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.206334 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.208227 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.206456 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.208354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.208374 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.208447 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.208578 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.208576 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.208691 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:08.708660124 +0000 UTC m=+94.340893182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.208982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.209209 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-17 01:11:08.709194957 +0000 UTC m=+94.341428175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.209925 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.211068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.211560 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.212032 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.212898 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.212970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.213242 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.213300 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.213346 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.213698 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.213422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.213801 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.216969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.218413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.220641 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222374 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222402 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222419 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222500 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:08.722471171 +0000 UTC m=+94.354704389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222785 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222807 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222819 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.222922 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:08.72284759 +0000 UTC m=+94.355080578 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.223413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.223517 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.224249 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.224277 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.224813 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.225159 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.226406 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.229491 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.236635 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.239314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.241778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.241976 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.242163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.242395 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.242809 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.242882 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.243130 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.243533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.243982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.244143 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.244200 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.247523 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.248432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.248590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.248717 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.249010 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.249017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.249100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.249734 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.249920 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.251208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.251239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.251276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.251300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.251314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.252019 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.253303 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.253367 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.259043 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.259803 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.271117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.278022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.278985 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301778 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301794 4735 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301809 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301823 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301837 4735 
reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301851 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301884 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301899 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301912 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301925 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301939 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301952 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301965 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301978 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.301991 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302003 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302019 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302031 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302044 4735 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" 
Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302058 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302070 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302083 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302097 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302110 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302122 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302137 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 
01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302150 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302161 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302176 4735 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302188 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302201 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302215 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302246 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302260 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302274 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302288 4735 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302301 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302314 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302327 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302340 4735 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302352 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302367 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302381 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302396 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302413 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302426 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302439 4735 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302452 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302465 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302481 4735 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302495 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302509 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302521 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302534 4735 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302547 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 17 
01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302576 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302592 4735 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302606 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302618 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302634 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302646 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302659 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302672 4735 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302684 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302697 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302711 4735 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302725 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302738 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302750 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302763 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302776 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302789 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302801 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302814 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302826 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302838 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302851 4735 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302878 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302891 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302905 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302917 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302929 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302941 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302953 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302965 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302977 4735 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.302990 4735 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303003 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303015 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303029 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303042 4735 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303054 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303068 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303079 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303091 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303103 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303115 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303126 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 
17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303138 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303151 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303162 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303174 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303186 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303197 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303209 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303223 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303235 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303247 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303259 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303270 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303282 4735 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303293 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303306 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" 
DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303318 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303330 4735 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303343 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303354 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303376 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303388 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303401 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303414 4735 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303427 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303439 4735 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303450 4735 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303462 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303473 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303487 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303499 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303511 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303525 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303538 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303551 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303564 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303577 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303589 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303601 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303614 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303626 4735 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303639 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303652 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303664 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303676 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303689 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303702 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303715 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303729 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303743 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303755 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303812 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 
01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.303981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.354643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.354685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.354702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.354724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.354740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.372623 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.378938 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.393389 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 01:11:08 crc kubenswrapper[4735]: W0317 01:11:08.399275 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-746735a5e30856bf03304ebf651b04bc61b446d499b33fb5538927a07a751dc5 WatchSource:0}: Error finding container 746735a5e30856bf03304ebf651b04bc61b446d499b33fb5538927a07a751dc5: Status 404 returned error can't find the container with id 746735a5e30856bf03304ebf651b04bc61b446d499b33fb5538927a07a751dc5 Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.457312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.457351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.457362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.457381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.457399 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.512712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e33a45ecda375655d2bcc0b57415c1e5c1c4c5b8d1c67f7139e88f1a312b81a7"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.513933 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"746735a5e30856bf03304ebf651b04bc61b446d499b33fb5538927a07a751dc5"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.515035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dcd4fe7c38f9f54ce9af41f886d06a92be5c4cf84613802b67a44d1ddd82c39c"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.560113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.560166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.560184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.560214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.560234 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.665025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.665063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.665080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.665100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.665114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.707402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.707724 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:11:09.707674242 +0000 UTC m=+95.339907360 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.768142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.768180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.768189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.768206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: 
I0317 01:11:08.768221 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.808222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.808277 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.808298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.808580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809334 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809418 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809443 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809561 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:09.809515621 +0000 UTC m=+95.441748609 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809446 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809551 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809700 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:09.809674336 +0000 UTC m=+95.441907354 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.809847 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:09.809788179 +0000 UTC m=+95.442021167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.814773 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.814808 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.814831 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: E0317 01:11:08.814942 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:09.814915389 +0000 UTC m=+95.447148367 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.871145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.871207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.871216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.871237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.871249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.977617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.978083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.978093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.978109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:08 crc kubenswrapper[4735]: I0317 01:11:08.978121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:08Z","lastTransitionTime":"2026-03-17T01:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.079778 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.080678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.080749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.080772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.080799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.080817 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.082088 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.084350 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.085777 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.087839 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.089032 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.090608 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.092590 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.094074 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.096180 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.097724 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.100247 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.101547 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.102660 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.104645 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.105996 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.108004 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.109219 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.110452 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.112586 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.113658 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.115374 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.116099 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.117729 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.118544 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.120174 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.121181 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.121961 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.123340 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.124034 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.125128 4735 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.125274 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.127924 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.129320 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.129833 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.131998 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.132795 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.134128 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.135578 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.136930 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.137580 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.138912 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.140192 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.141129 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.141988 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.143523 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.144630 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.147801 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.148920 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.150803 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.151889 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.153027 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.155168 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.156281 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.184425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.184468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.184478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.184497 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.184508 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.288048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.288110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.288122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.288138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.288165 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.391417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.391724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.391853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.392015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.392168 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.495030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.495085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.495098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.495117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.495130 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.521131 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.521207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.523553 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.536986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.554626 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.570888 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.586324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.597720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.597782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.597802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.597831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.597852 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.605951 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.627092 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.645303 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.661460 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.678056 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.699611 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.700417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.700458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.700473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.700497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.700517 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.713733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.717043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.717214 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:11:11.717189679 +0000 UTC m=+97.349422667 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.734308 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:09Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.804585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.804643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.804654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.804677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.804693 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.818479 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.818539 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.818570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.818602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.818741 4735 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.818822 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:11.818798113 +0000 UTC m=+97.451031101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.819035 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.819197 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:11.819175093 +0000 UTC m=+97.451408061 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.819280 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.819297 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.819333 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.819380 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:11.819364417 +0000 UTC m=+97.451597395 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.820343 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.820655 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.821016 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:09 crc kubenswrapper[4735]: E0317 01:11:09.822417 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:11.822393589 +0000 UTC m=+97.454626567 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.907997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.908049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.908062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.908082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:09 crc kubenswrapper[4735]: I0317 01:11:09.908094 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:09Z","lastTransitionTime":"2026-03-17T01:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.011188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.011237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.011246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.011263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.011274 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.072403 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.072460 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.072483 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.072599 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.072694 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.072798 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.114431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.114492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.114502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.114521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.114533 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.217678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.217742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.217754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.217776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.217787 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.320495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.320549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.320563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.320585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.320601 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.423307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.423345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.423357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.423375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.423388 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.530212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.530253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.530265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.530290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.530304 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.633131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.633188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.633200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.633218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.633232 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.736403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.736467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.736490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.736523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.736546 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.769518 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2s9p2"] Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.769999 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.773375 4735 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.773415 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.773706 4735 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.773741 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.774916 4735 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot 
list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.775051 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.778057 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mm58f"] Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.778323 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.778594 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-z669m"] Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.779190 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.782161 4735 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.782196 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.782275 4735 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.782292 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.782360 
4735 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.782376 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.783264 4735 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.783295 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.783314 4735 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" 
in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.783328 4735 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.783391 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.783407 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.783774 4735 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.783822 4735 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.784076 4735 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: W0317 01:11:10.784102 4735 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.784134 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: E0317 01:11:10.784103 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.795769 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.804597 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.823584 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.841066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.841115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.841127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.841147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.841160 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.845943 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.868497 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.888929 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.916977 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97qmd\" (UniqueName: \"kubernetes.io/projected/0fe43c53-58a2-4450-a71c-667e10384678-kube-api-access-97qmd\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-cni-bin\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0fe43c53-58a2-4450-a71c-667e10384678-proxy-tls\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928161 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fe43c53-58a2-4450-a71c-667e10384678-mcd-auth-proxy-config\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-etc-kubernetes\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-system-cni-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928255 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-cni-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928278 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/9d065a5d-f163-4cbd-8790-023a32481e7b-hosts-file\") pod \"node-resolver-2s9p2\" (UID: \"9d065a5d-f163-4cbd-8790-023a32481e7b\") " pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928299 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g86g4\" (UniqueName: \"kubernetes.io/projected/9d065a5d-f163-4cbd-8790-023a32481e7b-kube-api-access-g86g4\") pod \"node-resolver-2s9p2\" (UID: \"9d065a5d-f163-4cbd-8790-023a32481e7b\") " pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cnibin\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-cni-multus\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928357 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-kubelet\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928399 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-conf-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928417 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-multus-certs\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928436 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-os-release\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928451 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-daemon-config\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928471 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-k8s-cni-cncf-io\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928489 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-netns\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928568 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtql\" (UniqueName: \"kubernetes.io/projected/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-kube-api-access-lqtql\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0fe43c53-58a2-4450-a71c-667e10384678-rootfs\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cni-binary-copy\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928648 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-socket-dir-parent\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.928667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-hostroot\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.930577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.943755 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.943819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.943849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.943875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.943895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.943906 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:10Z","lastTransitionTime":"2026-03-17T01:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.960884 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.978673 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:10 crc kubenswrapper[4735]: I0317 01:11:10.990719 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:10Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.008408 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.024693 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029620 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-cni-bin\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029760 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97qmd\" (UniqueName: \"kubernetes.io/projected/0fe43c53-58a2-4450-a71c-667e10384678-kube-api-access-97qmd\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029786 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-cni-bin\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029798 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fe43c53-58a2-4450-a71c-667e10384678-proxy-tls\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029879 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fe43c53-58a2-4450-a71c-667e10384678-mcd-auth-proxy-config\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029927 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-etc-kubernetes\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029947 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-system-cni-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.029971 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-cni-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cnibin\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d065a5d-f163-4cbd-8790-023a32481e7b-hosts-file\") pod \"node-resolver-2s9p2\" (UID: \"9d065a5d-f163-4cbd-8790-023a32481e7b\") " pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030047 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-g86g4\" (UniqueName: \"kubernetes.io/projected/9d065a5d-f163-4cbd-8790-023a32481e7b-kube-api-access-g86g4\") pod \"node-resolver-2s9p2\" (UID: \"9d065a5d-f163-4cbd-8790-023a32481e7b\") " pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030067 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-multus-certs\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-cni-multus\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-kubelet\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030168 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-conf-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-os-release\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-daemon-config\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtql\" (UniqueName: \"kubernetes.io/projected/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-kube-api-access-lqtql\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-k8s-cni-cncf-io\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030286 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-netns\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0fe43c53-58a2-4450-a71c-667e10384678-rootfs\") pod \"machine-config-daemon-z669m\" (UID: 
\"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cni-binary-copy\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-socket-dir-parent\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-hostroot\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-hostroot\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-conf-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030445 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-kubelet\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030507 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-os-release\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030560 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-etc-kubernetes\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030641 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-system-cni-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.030949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cnibin\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-netns\") pod \"multus-mm58f\" (UID: 
\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d065a5d-f163-4cbd-8790-023a32481e7b-hosts-file\") pod \"node-resolver-2s9p2\" (UID: \"9d065a5d-f163-4cbd-8790-023a32481e7b\") " pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031057 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-cni-dir\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-k8s-cni-cncf-io\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031109 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-var-lib-cni-multus\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031115 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-host-run-multus-certs\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031174 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0fe43c53-58a2-4450-a71c-667e10384678-rootfs\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.031226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-socket-dir-parent\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.033036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-multus-daemon-config\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.043244 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.046411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.046458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.046470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc 
kubenswrapper[4735]: I0317 01:11:11.046494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.046507 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.054617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.071001 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.148873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.148909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.148918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.148936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.148947 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.229527 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9pv7f"] Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.230366 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.232747 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.234223 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5mhq"] Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.236947 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.242042 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.248900 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.248903 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.249187 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.249355 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.249575 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.249724 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.249821 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.255040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.255086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.255097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.255118 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.255130 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.266986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.286087 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.307933 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.331371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.333490 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.334465 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfcq\" (UniqueName: \"kubernetes.io/projected/26e9b32a-eeef-45ee-8107-2a94625bbf6b-kube-api-access-ztfcq\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 
01:11:11.334675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-system-cni-dir\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.334727 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.334780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-os-release\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.334970 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.335049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cnibin\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: 
\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.343740 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.358237 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.358617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.358652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.358667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.358689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.358703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.374559 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.391329 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.404739 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.427349 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436091 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-slash\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-etc-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cnibin\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436236 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhbd4\" (UniqueName: \"kubernetes.io/projected/5d25c473-740d-4af9-b5f7-72bfc5d911a4-kube-api-access-fhbd4\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-var-lib-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-kubelet\") pod \"ovnkube-node-x5mhq\" (UID: 
\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-systemd-units\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-ovn\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-netd\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436363 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-script-lib\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-systemd\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-node-log\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436490 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436514 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-log-socket\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436536 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-bin\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfcq\" (UniqueName: \"kubernetes.io/projected/26e9b32a-eeef-45ee-8107-2a94625bbf6b-kube-api-access-ztfcq\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-env-overrides\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovn-node-metrics-cert\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436648 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-system-cni-dir\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-os-release\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436731 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-netns\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436752 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436787 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.436807 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-config\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.437064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cnibin\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.437157 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-os-release\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.437276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-system-cni-dir\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.437758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26e9b32a-eeef-45ee-8107-2a94625bbf6b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.438107 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.448236 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.461944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.461973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.461986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.462004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.462017 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.468741 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.488352 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.505058 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.535141 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.536146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.537904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-config\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.537954 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.537978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-slash\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-etc-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538059 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhbd4\" (UniqueName: \"kubernetes.io/projected/5d25c473-740d-4af9-b5f7-72bfc5d911a4-kube-api-access-fhbd4\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538093 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-var-lib-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538116 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-ovn\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-slash\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538137 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-netd\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538185 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-netd\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538175 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-var-lib-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538236 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-etc-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-ovn\") pod \"ovnkube-node-x5mhq\" (UID: 
\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538344 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538407 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-kubelet\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538447 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-kubelet\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538466 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-systemd-units\") pod \"ovnkube-node-x5mhq\" (UID: 
\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-systemd-units\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-script-lib\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538744 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-systemd\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538771 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-node-log\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538846 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-log-socket\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 
01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-bin\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-env-overrides\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.538984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovn-node-metrics-cert\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539013 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-netns\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539058 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539126 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-openvswitch\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539138 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-config\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539206 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-bin\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539243 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-systemd\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539288 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-node-log\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.539319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-log-socket\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.540035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-script-lib\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.540113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-netns\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.540339 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-env-overrides\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.544206 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovn-node-metrics-cert\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.556308 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.560819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhbd4\" (UniqueName: \"kubernetes.io/projected/5d25c473-740d-4af9-b5f7-72bfc5d911a4-kube-api-access-fhbd4\") pod \"ovnkube-node-x5mhq\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.564302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.564346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.564358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 
crc kubenswrapper[4735]: I0317 01:11:11.564372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.564382 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.567804 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.574552 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.593112 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.607533 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.623669 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.645683 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.645798 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.652161 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fe43c53-58a2-4450-a71c-667e10384678-mcd-auth-proxy-config\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.663695 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.667244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.667297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.667310 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.667333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.667351 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.677926 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.698117 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.700804 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.715569 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.727741 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.732315 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.737680 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.741200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.741729 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:11:15.741699233 +0000 UTC m=+101.373932241 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.748218 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.759822 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.770133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.770191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.770210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.770238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.770260 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.780996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.796586 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.811126 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.831072 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.842081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842249 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842269 4735 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842284 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842367 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:15.842351294 +0000 UTC m=+101.474584272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842487 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842527 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842547 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842637 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:15.842608041 +0000 UTC m=+101.474841209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.842688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.842760 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.842798 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842902 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.842940 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:15.842931288 +0000 UTC m=+101.475164266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.843044 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: E0317 01:11:11.843180 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:15.843146843 +0000 UTC m=+101.475379821 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.850058 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.873172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.873209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.873223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.873245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.873261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.976166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.976213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.976230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.976255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.976274 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:11Z","lastTransitionTime":"2026-03-17T01:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:11 crc kubenswrapper[4735]: I0317 01:11:11.995497 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.008365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g86g4\" (UniqueName: \"kubernetes.io/projected/9d065a5d-f163-4cbd-8790-023a32481e7b-kube-api-access-g86g4\") pod \"node-resolver-2s9p2\" (UID: \"9d065a5d-f163-4cbd-8790-023a32481e7b\") " pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.022017 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.030782 4735 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.030945 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fe43c53-58a2-4450-a71c-667e10384678-proxy-tls podName:0fe43c53-58a2-4450-a71c-667e10384678 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:12.530910746 +0000 UTC m=+98.163143754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0fe43c53-58a2-4450-a71c-667e10384678-proxy-tls") pod "machine-config-daemon-z669m" (UID: "0fe43c53-58a2-4450-a71c-667e10384678") : failed to sync secret cache: timed out waiting for the condition Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.031329 4735 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.031401 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cni-binary-copy podName:a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d nodeName:}" failed. No retries permitted until 2026-03-17 01:11:12.531384917 +0000 UTC m=+98.163617935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cni-binary-copy") pod "multus-mm58f" (UID: "a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d") : failed to sync configmap cache: timed out waiting for the condition Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.073097 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.073150 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.073248 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.073300 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.073509 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:12 crc kubenswrapper[4735]: E0317 01:11:12.073666 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304483 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304501 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304690 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.304791 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.305022 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.305044 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.310276 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2s9p2" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.311399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26e9b32a-eeef-45ee-8107-2a94625bbf6b-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.311770 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtql\" (UniqueName: \"kubernetes.io/projected/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-kube-api-access-lqtql\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.312846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97qmd\" (UniqueName: \"kubernetes.io/projected/0fe43c53-58a2-4450-a71c-667e10384678-kube-api-access-97qmd\") pod 
\"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.328762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfcq\" (UniqueName: \"kubernetes.io/projected/26e9b32a-eeef-45ee-8107-2a94625bbf6b-kube-api-access-ztfcq\") pod \"multus-additional-cni-plugins-9pv7f\" (UID: \"26e9b32a-eeef-45ee-8107-2a94625bbf6b\") " pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:12 crc kubenswrapper[4735]: W0317 01:11:12.338498 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d065a5d_f163_4cbd_8790_023a32481e7b.slice/crio-80e115605d0829e9b242570c0329791c7f5be9c8fce794d9e3d49c004cca1e1f WatchSource:0}: Error finding container 80e115605d0829e9b242570c0329791c7f5be9c8fce794d9e3d49c004cca1e1f: Status 404 returned error can't find the container with id 80e115605d0829e9b242570c0329791c7f5be9c8fce794d9e3d49c004cca1e1f Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.408040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.408088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.408101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.408121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.408133 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.458830 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" Mar 17 01:11:12 crc kubenswrapper[4735]: W0317 01:11:12.473644 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e9b32a_eeef_45ee_8107_2a94625bbf6b.slice/crio-bbede8aa485d1ed4652e6fa709f8d267ebf91d3a5ed78be7120bc65d6bb6eec3 WatchSource:0}: Error finding container bbede8aa485d1ed4652e6fa709f8d267ebf91d3a5ed78be7120bc65d6bb6eec3: Status 404 returned error can't find the container with id bbede8aa485d1ed4652e6fa709f8d267ebf91d3a5ed78be7120bc65d6bb6eec3 Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.518965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.519044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.519059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.519082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.519097 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.543272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerStarted","Data":"bbede8aa485d1ed4652e6fa709f8d267ebf91d3a5ed78be7120bc65d6bb6eec3"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.544320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2s9p2" event={"ID":"9d065a5d-f163-4cbd-8790-023a32481e7b","Type":"ContainerStarted","Data":"80e115605d0829e9b242570c0329791c7f5be9c8fce794d9e3d49c004cca1e1f"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.545537 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" exitCode=0 Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.546019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.546093 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"2f60c084541cd59fca628ae9acd4f0ca315f093783b0040c8723163107ba4d6a"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.552163 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cni-binary-copy\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.552212 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fe43c53-58a2-4450-a71c-667e10384678-proxy-tls\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.554535 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d-cni-binary-copy\") pod \"multus-mm58f\" (UID: \"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\") " pod="openshift-multus/multus-mm58f" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.556417 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fe43c53-58a2-4450-a71c-667e10384678-proxy-tls\") pod \"machine-config-daemon-z669m\" (UID: \"0fe43c53-58a2-4450-a71c-667e10384678\") " pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.561554 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.572996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.593182 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.596602 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mm58f" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.605623 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.611945 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.628644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.628696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.628706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.628729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.628743 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.628956 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.651838 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.665057 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.678352 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.697777 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.712044 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.731439 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.732697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.732723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.732732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.732749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.732758 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.762266 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gt4rx"] Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.762693 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.765326 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.766285 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.766348 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.766940 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.781706 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.798434 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.819801 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.835496 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.837797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.837846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.837875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 
01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.837890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.837902 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.851490 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 
01:11:12.855347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnwtc\" (UniqueName: \"kubernetes.io/projected/434a2452-dc92-41c5-9236-02bd0d70b401-kube-api-access-dnwtc\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.855434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/434a2452-dc92-41c5-9236-02bd0d70b401-host\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.855484 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/434a2452-dc92-41c5-9236-02bd0d70b401-serviceca\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.865646 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.879125 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.894898 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.912814 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.941375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.941438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.941450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 
01:11:12.941467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.941478 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:12Z","lastTransitionTime":"2026-03-17T01:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.941830 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.955551 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.956121 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/434a2452-dc92-41c5-9236-02bd0d70b401-serviceca\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.957477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/434a2452-dc92-41c5-9236-02bd0d70b401-serviceca\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.956164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnwtc\" (UniqueName: \"kubernetes.io/projected/434a2452-dc92-41c5-9236-02bd0d70b401-kube-api-access-dnwtc\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.957606 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/434a2452-dc92-41c5-9236-02bd0d70b401-host\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.957680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/434a2452-dc92-41c5-9236-02bd0d70b401-host\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.971573 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:12Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:12 crc kubenswrapper[4735]: I0317 01:11:12.974071 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnwtc\" (UniqueName: \"kubernetes.io/projected/434a2452-dc92-41c5-9236-02bd0d70b401-kube-api-access-dnwtc\") pod \"node-ca-gt4rx\" (UID: \"434a2452-dc92-41c5-9236-02bd0d70b401\") " pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.044216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.044251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.044259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.044275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.044286 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.079818 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gt4rx" Mar 17 01:11:13 crc kubenswrapper[4735]: W0317 01:11:13.125499 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434a2452_dc92_41c5_9236_02bd0d70b401.slice/crio-f8947475712473e0cb0b8d6a7c94d01389194ca5a810f1159c7f4184b9101f6b WatchSource:0}: Error finding container f8947475712473e0cb0b8d6a7c94d01389194ca5a810f1159c7f4184b9101f6b: Status 404 returned error can't find the container with id f8947475712473e0cb0b8d6a7c94d01389194ca5a810f1159c7f4184b9101f6b Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.147142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.147182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.147195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.147215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.147226 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.250135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.250195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.250210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.250234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.250249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.352978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.353026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.353040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.353062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.353078 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.456337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.456814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.456823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.456894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.456905 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.551140 4735 generic.go:334] "Generic (PLEG): container finished" podID="26e9b32a-eeef-45ee-8107-2a94625bbf6b" containerID="c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c" exitCode=0 Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.551241 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerDied","Data":"c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.554699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2s9p2" event={"ID":"9d065a5d-f163-4cbd-8790-023a32481e7b","Type":"ContainerStarted","Data":"590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560440 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerStarted","Data":"c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560519 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerStarted","Data":"f540d19e7e7dae250aeb506587b91ce4027e3ab85f54ea60bb29ceadedbc7f8f"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560879 4735 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.560915 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.563647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gt4rx" event={"ID":"434a2452-dc92-41c5-9236-02bd0d70b401","Type":"ContainerStarted","Data":"e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.563720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gt4rx" event={"ID":"434a2452-dc92-41c5-9236-02bd0d70b401","Type":"ContainerStarted","Data":"f8947475712473e0cb0b8d6a7c94d01389194ca5a810f1159c7f4184b9101f6b"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.568237 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.568282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} Mar 
17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.568295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.568306 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.568317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.568328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.573148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.573321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 
01:11:13.573473 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"a9c8154a4f360841f6f7b94ffc14a52ef7983e66eeb311de4b765167e43f16ed"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.592060 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.611152 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.629324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.649625 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.665446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.665493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.665507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.666207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.666254 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.667686 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.683494 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.701387 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.718587 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.743542 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.757066 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.769029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.769085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.769100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 
01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.769122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.769139 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.774331 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.789299 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.806729 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.818580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.830325 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.843462 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.868744 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.871409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.871508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.871583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.871659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.871716 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.885539 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.906984 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.922144 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.936261 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.948096 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.960823 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.974407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.974452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.974468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.974484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.974494 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:13Z","lastTransitionTime":"2026-03-17T01:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:13 crc kubenswrapper[4735]: I0317 01:11:13.976917 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:13Z 
is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.072604 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:14 crc kubenswrapper[4735]: E0317 01:11:14.072803 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.073409 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:14 crc kubenswrapper[4735]: E0317 01:11:14.073518 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.073602 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:14 crc kubenswrapper[4735]: E0317 01:11:14.073692 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.076988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.077028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.077045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.077069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.077087 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.179646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.179701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.179721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.179746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.179765 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.284459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.284524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.284536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.284558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.284572 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.389051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.389139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.389157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.389185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.389261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.492709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.492752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.492763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.492781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.492792 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.579739 4735 generic.go:334] "Generic (PLEG): container finished" podID="26e9b32a-eeef-45ee-8107-2a94625bbf6b" containerID="22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c" exitCode=0 Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.580941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerDied","Data":"22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.595303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.595343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.595363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.595391 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.595411 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.629456 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.654587 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.699136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.699251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.699266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.699289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.699302 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.700371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.727522 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.741505 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.753955 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.764099 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.775612 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.786702 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.797740 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.801831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.801892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.801906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.801927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.801942 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.809013 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.822407 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-17T01:11:14Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.905811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.905950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.905972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.906003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:14 crc kubenswrapper[4735]: I0317 01:11:14.906024 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:14Z","lastTransitionTime":"2026-03-17T01:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.014927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.014975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.014993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.015018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.015035 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.091847 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.111103 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.117671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.117722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.117733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.117757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.117768 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.129562 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.157182 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.172328 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.188668 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.207044 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.221172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.221146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.221217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.221369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.221408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.221430 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.235793 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.247357 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.264414 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.803780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.803969 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:11:23.803940484 +0000 UTC m=+109.436173462 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.806157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.806206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.806215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.806231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.806241 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.812663 4735 generic.go:334] "Generic (PLEG): container finished" podID="26e9b32a-eeef-45ee-8107-2a94625bbf6b" containerID="5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713" exitCode=0 Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.812737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerDied","Data":"5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.825300 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.853851 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.872636 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.889288 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.903388 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.904851 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.904941 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.904982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.905014 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905045 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905225 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905257 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905252 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905275 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905290 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905303 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.905353 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.906039 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:23.905093431 +0000 UTC m=+109.537326409 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.906156 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:23.906144176 +0000 UTC m=+109.538377154 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.906303 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:23.90629008 +0000 UTC m=+109.538523058 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.906353 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:23.906344551 +0000 UTC m=+109.538577529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.909060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.909089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.909107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.909133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.909144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.923630 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.952608 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.957752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.957786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.957796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.957811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.957820 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.973832 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.974156 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.980727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.980749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.980759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.980775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.980783 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:15Z","lastTransitionTime":"2026-03-17T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:15 crc kubenswrapper[4735]: I0317 01:11:15.989766 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:15 crc kubenswrapper[4735]: E0317 01:11:15.994319 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.003732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.004183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.004199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.004222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.004241 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.013913 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.026539 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.028232 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.033585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.033657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.033670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.033876 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.033898 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.042888 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z 
is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.048575 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.051448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.051485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.051495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.051511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.051536 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.058445 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.062532 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.062649 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.064540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.064632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.064720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.064805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.064888 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.072882 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.073072 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.073153 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.073254 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.073622 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:16 crc kubenswrapper[4735]: E0317 01:11:16.073771 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.167331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.167576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.167680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.167778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.167891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.270723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.270824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.270847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.270915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.270934 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.374780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.375009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.375259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.375453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.375627 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.479955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.480277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.480402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.480538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.480662 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.584423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.584764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.585190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.585447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.585655 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.689353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.689418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.689436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.689463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.689482 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.792819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.793297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.793482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.793665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.793837 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.820996 4735 generic.go:334] "Generic (PLEG): container finished" podID="26e9b32a-eeef-45ee-8107-2a94625bbf6b" containerID="d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664" exitCode=0 Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.821065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerDied","Data":"d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.828153 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.859634 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.881187 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.896819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.896899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.896912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.896932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.896946 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:16Z","lastTransitionTime":"2026-03-17T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.903591 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.930268 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.944472 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.960418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.984564 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:16 crc kubenswrapper[4735]: I0317 01:11:16.996230 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:16Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:16.999952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.000029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.001710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 
01:11:17.001735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.001748 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.011219 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.027571 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.043004 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.057165 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.089899 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.090391 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.104742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.105078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.105118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.105142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.105158 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.208472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.208524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.208536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.208555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.208565 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.312277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.312322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.312333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.312353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.312366 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.416092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.416165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.416191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.416228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.416253 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.520036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.520092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.520109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.520131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.520144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.623555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.623603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.623612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.623632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.623641 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.726722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.726799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.726818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.726852 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.726901 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.838736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.838810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.838823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.838845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.838873 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.848639 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.851585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.852156 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.856414 4735 generic.go:334] "Generic (PLEG): container finished" podID="26e9b32a-eeef-45ee-8107-2a94625bbf6b" containerID="72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee" exitCode=0 Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.856473 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerDied","Data":"72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.879119 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.899060 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.943049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:17 crc 
kubenswrapper[4735]: I0317 01:11:17.943111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.943133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.943160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.943180 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:17Z","lastTransitionTime":"2026-03-17T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.945134 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.973653 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:17 crc kubenswrapper[4735]: I0317 01:11:17.998628 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.026449 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.046187 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.046576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.046606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.046623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.046648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.046667 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.064342 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z 
is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.073250 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.073313 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.073333 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:18 crc kubenswrapper[4735]: E0317 01:11:18.073446 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:18 crc kubenswrapper[4735]: E0317 01:11:18.073601 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:18 crc kubenswrapper[4735]: E0317 01:11:18.073776 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.083753 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.102262 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.117458 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.134088 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.152787 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.153518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.153554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.153566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.153583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.153595 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.164376 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.182961 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f16
2339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.203192 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.219067 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.236672 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.258972 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.265965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.266017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.266028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.266047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.266061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.282069 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.295206 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.309917 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.328085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.338705 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.353015 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.369214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.369253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.369263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.369282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.369294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.369807 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.471481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.471526 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.471535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.471574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.471587 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.575215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.575270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.575285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.575308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.575324 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.678570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.678616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.678626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.678644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.678656 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.781684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.782154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.782186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.782221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.782246 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.871103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.871360 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.871448 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.871525 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.879122 4735 generic.go:334] "Generic (PLEG): container finished" podID="26e9b32a-eeef-45ee-8107-2a94625bbf6b" containerID="410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831" exitCode=0 Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.879940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerDied","Data":"410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.903537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.903591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.903606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.903632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.903650 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:18Z","lastTransitionTime":"2026-03-17T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.914668 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.952470 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.964560 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.967761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:18 crc kubenswrapper[4735]: I0317 01:11:18.992337 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-17T01:11:18Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.009368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.009423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.009436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.009462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.009477 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.015571 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.036080 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.050551 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.061997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.083414 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.094050 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.104291 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.111822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.111879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.111891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.111909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.111920 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.117884 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.127812 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.142021 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.161027 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.179510 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af
065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.200106 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.210727 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.214583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.214613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.214622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.214638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 
01:11:19.214649 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.224009 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.239241 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.251612 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.263499 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.277919 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.294178 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.309387 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.317185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.317235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.317245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.317263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.317276 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.322540 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.343537 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cd
b6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.420307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.420647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.420733 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.420827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.420967 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.523462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.523507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.523517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.523532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.523545 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.626823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.627303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.627458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.627607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.627799 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.731194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.731255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.731276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.731303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.731322 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.835050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.835134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.835158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.835199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.835224 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.889225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" event={"ID":"26e9b32a-eeef-45ee-8107-2a94625bbf6b","Type":"ContainerStarted","Data":"0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.910408 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.931371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\
\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.938395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.938453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.938467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.938493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.938507 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:19Z","lastTransitionTime":"2026-03-17T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.955989 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.972400 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:19 crc kubenswrapper[4735]: I0317 01:11:19.991158 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:19Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.014026 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.033816 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.041146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.041205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.041257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 
01:11:20.041290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.041315 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.058105 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.072147 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:20 crc kubenswrapper[4735]: E0317 01:11:20.072348 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.072162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:20 crc kubenswrapper[4735]: E0317 01:11:20.072481 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.072147 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:20 crc kubenswrapper[4735]: E0317 01:11:20.072572 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.084798 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.101816 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a
4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.122010 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.141936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.144249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc 
kubenswrapper[4735]: I0317 01:11:20.144324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.144343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.144378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.144396 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.186136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:20Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.247340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.247404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.247424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.247451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.247469 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.350554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.350913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.351053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.351248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.351376 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.454435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.454788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.454987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.455128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.455294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.558499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.558555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.558565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.558582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.558592 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.661005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.661092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.661105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.661124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.661136 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.764674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.764724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.764743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.764767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.764784 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.867716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.867786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.867804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.867836 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.867895 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.970735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.970787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.970805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.970829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:20 crc kubenswrapper[4735]: I0317 01:11:20.970846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:20Z","lastTransitionTime":"2026-03-17T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.074287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.074321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.074332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.074351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.074365 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.177888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.177952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.177968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.177991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.178003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.282148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.282720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.282744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.282783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.282805 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.386426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.386480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.386495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.386527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.386545 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.490348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.490541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.490566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.490597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.490617 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.596671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.596727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.596745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.596771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.596789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.700344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.700402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.700420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.700445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.700460 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.803737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.803789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.803800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.803817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.803830 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.899660 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/0.log" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.903515 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5" exitCode=1 Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.903580 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.904841 4735 scope.go:117] "RemoveContainer" containerID="99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.907501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.907535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.907549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.907566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.907581 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:21Z","lastTransitionTime":"2026-03-17T01:11:21Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.942152 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:21Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.959495 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:21Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:21 crc kubenswrapper[4735]: I0317 01:11:21.974686 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:21Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.001029 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:21Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.011104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.011157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.011169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.011188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.011201 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.022153 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.041744 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.056219 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af
065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.072339 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.072370 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:22 crc kubenswrapper[4735]: E0317 01:11:22.072520 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.072537 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:22 crc kubenswrapper[4735]: E0317 01:11:22.072651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:22 crc kubenswrapper[4735]: E0317 01:11:22.072728 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.089267 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:21Z\\\",\\\"message\\\":\\\"nt handler 2\\\\nI0317 01:11:21.220542 6384 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 01:11:21.220779 6384 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221037 6384 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221205 6384 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221448 6384 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221731 6384 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221996 6384 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0317 01:11:21.222304 6384 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.102801 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.114063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.114122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.114137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.114158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.114173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.116064 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.133108 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.147296 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.161714 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.216847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.216924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.216934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.216951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.216961 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.323747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.323823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.323842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.323902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.323926 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.427431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.427492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.427516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.427546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.427570 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.532178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.532241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.532257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.532322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.532346 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.634786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.634838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.634852 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.634892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.634907 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.738196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.738270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.738288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.738323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.738343 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.841409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.841498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.841515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.841549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.841569 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.909783 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/0.log" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.913404 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.914129 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.931714 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.944140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.944471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.944635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.944816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.944994 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:22Z","lastTransitionTime":"2026-03-17T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.947618 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.980335 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:21Z\\\",\\\"message\\\":\\\"nt handler 2\\\\nI0317 01:11:21.220542 6384 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 01:11:21.220779 6384 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221037 6384 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221205 6384 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221448 6384 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221731 6384 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221996 6384 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0317 01:11:21.222304 6384 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\
"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:22 crc kubenswrapper[4735]: I0317 01:11:22.998376 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tm
p/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:22Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.021389 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.041001 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.048450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 
01:11:23.048509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.048527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.048554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.048573 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.057968 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.073649 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.092900 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.110370 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.128941 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.147080 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.151208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.151274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.151292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.151319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.151337 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.166220 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.254055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.254118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.254133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.254154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.254167 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.356969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.357060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.357084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.357123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.357147 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.468647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.468786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.468904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.468995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.469026 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.573768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.573850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.573897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.573923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.573943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.677974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.678012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.678020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.678036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.678049 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.782242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.782345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.782371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.782413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.782439 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.885934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.886011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.886031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.886063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.886086 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.905004 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.905265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:23 crc kubenswrapper[4735]: E0317 01:11:23.905441 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:23 crc kubenswrapper[4735]: E0317 01:11:23.905617 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:39.905571625 +0000 UTC m=+125.537804663 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:23 crc kubenswrapper[4735]: E0317 01:11:23.905923 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:11:39.905842121 +0000 UTC m=+125.538075139 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.940404 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/1.log" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.941834 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/0.log" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.947434 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28" exitCode=1 Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.947504 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28"} Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.947565 4735 scope.go:117] "RemoveContainer" containerID="99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.948840 4735 scope.go:117] "RemoveContainer" containerID="d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28" Mar 17 01:11:23 crc kubenswrapper[4735]: E0317 01:11:23.949172 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.984313 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:23Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.992359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:23 crc 
kubenswrapper[4735]: I0317 01:11:23.992461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.992483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.992557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:23 crc kubenswrapper[4735]: I0317 01:11:23.992575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:23Z","lastTransitionTime":"2026-03-17T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.006045 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.006318 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.006360 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.006385 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.006468 
4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:40.006438441 +0000 UTC m=+125.638671469 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.006145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.006751 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.006905 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.007166 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.007228 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.007246 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.007330 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:40.007279861 +0000 UTC m=+125.639512869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.007366 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.007444 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:11:40.007422095 +0000 UTC m=+125.639655163 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.023809 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.040222 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.060603 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.072743 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.072823 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.072910 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.072956 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.073131 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.073253 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.087411 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.095781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.095853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.095919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.095954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.095981 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.110250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.129919 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.155936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.177016 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.196425 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.199331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.199395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.199414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.199439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.199458 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.216054 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.249999 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:21Z\\\",\\\"message\\\":\\\"nt handler 2\\\\nI0317 01:11:21.220542 6384 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 01:11:21.220779 6384 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221037 6384 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221205 6384 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221448 6384 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221731 6384 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221996 6384 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0317 01:11:21.222304 6384 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 
7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\
\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.302969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.303073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.303092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.303119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.303140 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.330701 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw"] Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.331699 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.335190 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.336509 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.361382 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1
720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068
fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6
f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.382264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.406144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.406201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.406218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc 
kubenswrapper[4735]: I0317 01:11:24.406244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.406299 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.409745 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.410021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfmlm\" (UniqueName: \"kubernetes.io/projected/6577add2-4376-4c19-b025-31caf2138e36-kube-api-access-pfmlm\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.410112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6577add2-4376-4c19-b025-31caf2138e36-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.410145 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6577add2-4376-4c19-b025-31caf2138e36-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.410215 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6577add2-4376-4c19-b025-31caf2138e36-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.427062 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af
065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.459201 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99819272bf061dbd60e1358a19e3f46462a913448a7321522d7a92020646e4c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:21Z\\\",\\\"message\\\":\\\"nt handler 2\\\\nI0317 01:11:21.220542 6384 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 01:11:21.220779 6384 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221037 6384 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221205 6384 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:21.221448 6384 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221731 6384 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 01:11:21.221996 6384 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0317 01:11:21.222304 6384 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 
7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\
\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.476736 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.497489 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510145 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6577add2-4376-4c19-b025-31caf2138e36-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510808 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfmlm\" (UniqueName: \"kubernetes.io/projected/6577add2-4376-4c19-b025-31caf2138e36-kube-api-access-pfmlm\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6577add2-4376-4c19-b025-31caf2138e36-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.510978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6577add2-4376-4c19-b025-31caf2138e36-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.512489 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6577add2-4376-4c19-b025-31caf2138e36-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.512775 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6577add2-4376-4c19-b025-31caf2138e36-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.521488 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6577add2-4376-4c19-b025-31caf2138e36-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.522444 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.538978 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.546653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfmlm\" (UniqueName: \"kubernetes.io/projected/6577add2-4376-4c19-b025-31caf2138e36-kube-api-access-pfmlm\") pod \"ovnkube-control-plane-749d76644c-fl2dw\" (UID: \"6577add2-4376-4c19-b025-31caf2138e36\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.561663 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.583816 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.613390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.613879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.613892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.613914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.613932 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.616309 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.637212 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.650385 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.656294 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.716922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.716979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.716998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.717028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.717045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.824382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.824447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.824472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.824505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.824526 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.927229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.927271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.927284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.927304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.927317 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:24Z","lastTransitionTime":"2026-03-17T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.958207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" event={"ID":"6577add2-4376-4c19-b025-31caf2138e36","Type":"ContainerStarted","Data":"b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.958281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" event={"ID":"6577add2-4376-4c19-b025-31caf2138e36","Type":"ContainerStarted","Data":"8df89184544137241b2d5ff2450877c663864db87dd33cb526a48ee7b05e578a"} Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.960424 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/1.log" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.963716 4735 scope.go:117] "RemoveContainer" containerID="d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28" Mar 17 01:11:24 crc kubenswrapper[4735]: E0317 01:11:24.963924 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.982251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:24 crc kubenswrapper[4735]: I0317 01:11:24.998834 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:24Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.020956 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.030305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.030355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.030369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.030390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.030404 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.040800 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z 
is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.063846 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.086646 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.104063 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.122383 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.132773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.132828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.132851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.132899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.132916 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.141418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.167465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.197217 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.219526 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.235764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.235810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.235823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.235842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.235872 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.253343 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.264469 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.278746 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.300543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.309358 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.320159 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.333186 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.338746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.338802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.338825 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.338878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.338898 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.342891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a
90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.361379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.375006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.387816 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.406064 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.417480 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.432847 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.441601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.441647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.441660 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.441681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.441694 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.446568 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.468554 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.544200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.544267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.544284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.544313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.544331 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.647110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.647362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.647433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.647542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.647604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.751176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.751466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.751531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.751596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.751689 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.854849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.854975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.854993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.855022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.855040 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.958189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.958260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.958279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.958307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.958324 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:25Z","lastTransitionTime":"2026-03-17T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.969363 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" event={"ID":"6577add2-4376-4c19-b025-31caf2138e36","Type":"ContainerStarted","Data":"927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0"} Mar 17 01:11:25 crc kubenswrapper[4735]: I0317 01:11:25.990401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-co
nfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:25Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.011210 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.030797 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.055471 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.061105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.061163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.061182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.061209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.061231 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.072898 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.072947 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.073130 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.073001 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.073305 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.073498 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.081564 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.100985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.118743 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.138712 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.163000 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.165113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.165181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.165200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.165228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.165247 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.185327 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.204696 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.223337 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.257586 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-dkwf5"] Mar 17 01:11:26 crc kubenswrapper[4735]: 
I0317 01:11:26.258302 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.258389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.268532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.268560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.268571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.268585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.268596 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.278300 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.301039 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.317922 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.332138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.332177 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdzxp\" (UniqueName: \"kubernetes.io/projected/3a72fe2c-32fb-4360-882b-44debb825c9e-kube-api-access-zdzxp\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.351466 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.354968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.355002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.355014 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.355031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.355042 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.364420 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a
90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.368562 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.377802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.377914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.377930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.377956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.377976 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.382161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z 
is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.395624 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.399375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.399406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.399415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.399431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.399443 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.401160 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.412505 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.415280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.415302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.415309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.415325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.415336 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.417883 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.429423 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has 
no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c6
9fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\
\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737
e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909
bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.430302 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.433224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.433275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzxp\" (UniqueName: \"kubernetes.io/projected/3a72fe2c-32fb-4360-882b-44debb825c9e-kube-api-access-zdzxp\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.433394 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.433464 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:11:26.93344518 +0000 UTC m=+112.565678158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.435984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.436038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.436053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.436075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.436088 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.445347 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.451770 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.451972 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.454214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.454271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.454283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.454319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.454334 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.458660 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzxp\" (UniqueName: \"kubernetes.io/projected/3a72fe2c-32fb-4360-882b-44debb825c9e-kube-api-access-zdzxp\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.461218 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.477599 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.492264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc 
kubenswrapper[4735]: I0317 01:11:26.506577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.519275 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 
01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.539997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.557694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.557785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.557805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.557829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.557848 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.566446 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:26Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.661711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.661788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.661811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.661842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.661896 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.765382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.765453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.765472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.765500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.765524 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.869057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.869123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.869141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.869167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.869189 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.939553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.939829 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:26 crc kubenswrapper[4735]: E0317 01:11:26.940005 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:11:27.939970606 +0000 UTC m=+113.572203634 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.973058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.973120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.973140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.973169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:26 crc kubenswrapper[4735]: I0317 01:11:26.973187 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:26Z","lastTransitionTime":"2026-03-17T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.076698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.076773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.076793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.076842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.076909 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.180842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.180973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.180992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.181019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.181038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.285126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.285251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.285278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.285313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.285336 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.388842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.388937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.388957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.388984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.389003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.492395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.492456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.492472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.492495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.492508 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.596418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.596496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.596520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.596551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.596595 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.700533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.700622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.700645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.700676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.700699 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.810630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.810696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.810717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.810745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.810762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.914574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.914639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.914655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.914681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.914699 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:27Z","lastTransitionTime":"2026-03-17T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:27 crc kubenswrapper[4735]: I0317 01:11:27.951599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:27 crc kubenswrapper[4735]: E0317 01:11:27.951894 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:27 crc kubenswrapper[4735]: E0317 01:11:27.952001 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:11:29.951970122 +0000 UTC m=+115.584203130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.017969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.018019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.018034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.018052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.018065 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.073027 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.073089 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.073157 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:28 crc kubenswrapper[4735]: E0317 01:11:28.073190 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.073209 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:28 crc kubenswrapper[4735]: E0317 01:11:28.073314 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:28 crc kubenswrapper[4735]: E0317 01:11:28.073451 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:28 crc kubenswrapper[4735]: E0317 01:11:28.073541 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.122527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.122586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.122606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.122633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.122650 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.225169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.225270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.225295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.225332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.225363 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.328717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.328799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.328825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.328903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.328933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.432011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.432071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.432085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.432104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.432118 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.535538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.535601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.535631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.535664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.535684 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.638822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.638926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.638953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.638983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.639004 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.741695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.741753 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.741773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.741797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.741816 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.845062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.845136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.845186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.845217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.845238 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.949237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.949316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.949345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.949377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:28 crc kubenswrapper[4735]: I0317 01:11:28.949399 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:28Z","lastTransitionTime":"2026-03-17T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.058259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.058323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.058341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.058369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.058386 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.161650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.161762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.161780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.161803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.161821 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.264093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.264131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.264141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.264157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.264170 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.367332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.367369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.367380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.367397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.367409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.469401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.469442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.469454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.469474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.469486 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.572373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.572429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.572447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.572472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.572492 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.676157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.676219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.676237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.676264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.676283 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.779463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.779514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.779525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.779543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.779555 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.882776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.882824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.882837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.882908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.882918 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.973424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:29 crc kubenswrapper[4735]: E0317 01:11:29.973716 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:29 crc kubenswrapper[4735]: E0317 01:11:29.973874 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:11:33.973824154 +0000 UTC m=+119.606057152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.985402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.985436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.985444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.985458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:29 crc kubenswrapper[4735]: I0317 01:11:29.985468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:29Z","lastTransitionTime":"2026-03-17T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.072948 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:30 crc kubenswrapper[4735]: E0317 01:11:30.073147 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.073228 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.073228 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.073276 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:30 crc kubenswrapper[4735]: E0317 01:11:30.073378 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:30 crc kubenswrapper[4735]: E0317 01:11:30.073468 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:30 crc kubenswrapper[4735]: E0317 01:11:30.073621 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.089300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.089369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.089386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.089415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.089438 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.192495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.192562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.192589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.192622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.192648 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.295568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.295628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.295650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.295678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.295701 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.361520 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.379410 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.395773 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.398273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.398414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.398498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.398616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.398696 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.410533 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.429156 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T
01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8
efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.443284 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.463710 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.484897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.500244 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: 
I0317 01:11:30.503197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.503381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.503499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.503656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.503774 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.529830 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.547144 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.564877 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.586486 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.599637 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.606195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.606226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.606238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.606256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.606267 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.621997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z 
is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.642105 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:30Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.709894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.709969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.709995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.710025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.710046 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.813004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.813070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.813087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.813111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.813133 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.916418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.916478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.916496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.916522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:30 crc kubenswrapper[4735]: I0317 01:11:30.916543 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:30Z","lastTransitionTime":"2026-03-17T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.019424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.019485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.019503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.019526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.019543 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.122429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.122496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.122515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.122540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.122562 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.225403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.225478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.225495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.225527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.225549 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.329836 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.329931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.329950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.329979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.329997 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.432995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.433046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.433110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.433155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.433193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.536309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.536372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.536390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.536416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.536434 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.640263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.640320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.640331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.640388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.640404 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.743018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.743094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.743111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.743138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.743157 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.845837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.845943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.845964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.845989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.846009 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.950209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.950293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.950318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.950352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:31 crc kubenswrapper[4735]: I0317 01:11:31.950377 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:31Z","lastTransitionTime":"2026-03-17T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.053910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.053970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.053987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.054012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.054030 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.072399 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.072444 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.072406 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:32 crc kubenswrapper[4735]: E0317 01:11:32.072579 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.072650 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:32 crc kubenswrapper[4735]: E0317 01:11:32.072782 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:32 crc kubenswrapper[4735]: E0317 01:11:32.072982 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:32 crc kubenswrapper[4735]: E0317 01:11:32.073182 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.157844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.157935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.157954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.157982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.158001 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.260706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.260733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.260742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.260757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.260766 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.363727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.363780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.363798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.363825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.363843 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.468799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.468901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.468928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.468962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.468985 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.572700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.572770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.572787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.572815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.572832 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.675371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.675452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.675475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.675506 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.675530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.778283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.778352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.778376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.778401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.778420 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.881648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.881704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.881721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.881744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.881762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.985092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.985144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.985162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.985195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:32 crc kubenswrapper[4735]: I0317 01:11:32.985215 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:32Z","lastTransitionTime":"2026-03-17T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.088717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.089171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.089319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.089475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.089599 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.193329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.193462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.193481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.193506 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.193523 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.296813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.296910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.296934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.296979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.297001 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.399451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.399517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.399535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.399562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.399580 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.503248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.503303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.503328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.503358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.503380 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.606299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.606367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.606388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.606420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.606440 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.709324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.709359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.709396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.709414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.709424 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.812889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.812967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.812985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.813036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.813054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.916337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.916392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.916411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.916441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:33 crc kubenswrapper[4735]: I0317 01:11:33.916460 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:33Z","lastTransitionTime":"2026-03-17T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.019115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.019158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.019167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.019182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.019193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.020684 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:34 crc kubenswrapper[4735]: E0317 01:11:34.020848 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:34 crc kubenswrapper[4735]: E0317 01:11:34.020958 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:11:42.020933717 +0000 UTC m=+127.653166725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.072312 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.072318 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:34 crc kubenswrapper[4735]: E0317 01:11:34.072486 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.072323 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:34 crc kubenswrapper[4735]: E0317 01:11:34.072643 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:34 crc kubenswrapper[4735]: E0317 01:11:34.072706 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.073071 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:34 crc kubenswrapper[4735]: E0317 01:11:34.073314 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.133468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.133550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.133569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.134061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.134121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.239478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.239562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.239580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.239607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.239624 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.343557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.344610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.344749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.344907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.345052 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.447689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.447765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.447784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.447811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.447830 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.551566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.551649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.551674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.551709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.551734 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.654489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.654562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.654587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.654620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.654644 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.758448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.758517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.758538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.758566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.758590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.861609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.861700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.861725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.861762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.861786 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.964944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.965032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.965078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.965105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:34 crc kubenswrapper[4735]: I0317 01:11:34.965124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:34Z","lastTransitionTime":"2026-03-17T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:35 crc kubenswrapper[4735]: E0317 01:11:35.065383 4735 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.073733 4735 scope.go:117] "RemoveContainer" containerID="d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.093928 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.115475 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.132445 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc 
kubenswrapper[4735]: I0317 01:11:35.151748 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: E0317 01:11:35.153909 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.207495 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.229952 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.247073 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.262344 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.276385 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.290395 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.303118 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.320770 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 
01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.336215 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.351821 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:35 crc kubenswrapper[4735]: I0317 01:11:35.364154 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:35Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.016566 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/1.log" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.020754 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57"} Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.021688 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.046647 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.072254 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.072451 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.072827 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.072951 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.073000 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.073035 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.073205 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.073414 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.073692 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.102497 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.125813 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.146093 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.171104 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.185881 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc 
kubenswrapper[4735]: I0317 01:11:36.204171 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.221189 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 
01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.245426 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.262985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.279424 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.300366 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.313772 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.334607 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.651821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.651933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.651951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.651980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.652002 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:36Z","lastTransitionTime":"2026-03-17T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.674376 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.679604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.679707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.679726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.679752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.679773 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:36Z","lastTransitionTime":"2026-03-17T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.705915 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.713624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.713763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.713783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.713807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.713851 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:36Z","lastTransitionTime":"2026-03-17T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.738059 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.746666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.746763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.746826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.746911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.746937 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:36Z","lastTransitionTime":"2026-03-17T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.774489 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.779457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.779541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.779561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.779620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:36 crc kubenswrapper[4735]: I0317 01:11:36.779639 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:36Z","lastTransitionTime":"2026-03-17T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.800149 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:36Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:36 crc kubenswrapper[4735]: E0317 01:11:36.800715 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.028380 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/2.log" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.029491 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/1.log" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.034005 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57" exitCode=1 Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.034063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57"} Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.034115 4735 scope.go:117] "RemoveContainer" containerID="d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.035397 4735 scope.go:117] "RemoveContainer" containerID="c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57" Mar 17 01:11:37 crc kubenswrapper[4735]: E0317 01:11:37.035725 4735 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.057321 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.082132 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.088676 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 17 01:11:37 crc 
kubenswrapper[4735]: I0317 01:11:37.104631 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.128799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.150691 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.171412 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.190974 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.213136 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a7
6de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.237737 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.260268 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.277745 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc 
kubenswrapper[4735]: I0317 01:11:37.309610 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6ebd3fb43f0959f30b3b498fe730315519b7d29e7f646840e84432834416b28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:23Z\\\",\\\"message\\\":\\\"informers/externalversions/factory.go:141\\\\nI0317 01:11:23.030687 6586 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 01:11:23.031345 6586 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 01:11:23.031371 6586 handler.go:190] Sending *v1.Namespace event handler 
5 for removal\\\\nI0317 01:11:23.031386 6586 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 01:11:23.031418 6586 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 01:11:23.031427 6586 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0317 01:11:23.031438 6586 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 01:11:23.031437 6586 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 01:11:23.031446 6586 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 01:11:23.031457 6586 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 01:11:23.031460 6586 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 01:11:23.031472 6586 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 01:11:23.031505 6586 factory.go:656] Stopping watch factory\\\\nI0317 01:11:23.031534 6586 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 01:11:23.031543 6586 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba102
4ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.324514 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.350533 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:37 crc kubenswrapper[4735]: I0317 01:11:37.374120 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af
065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:37Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.040446 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/2.log" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.046580 4735 scope.go:117] "RemoveContainer" containerID="c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57" Mar 17 01:11:38 crc kubenswrapper[4735]: E0317 01:11:38.047033 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.067521 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.072168 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.072191 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.072206 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:38 crc kubenswrapper[4735]: E0317 01:11:38.072423 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:38 crc kubenswrapper[4735]: E0317 01:11:38.072562 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.072680 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:38 crc kubenswrapper[4735]: E0317 01:11:38.072828 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:38 crc kubenswrapper[4735]: E0317 01:11:38.073002 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.092636 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.109948 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.134301 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.154773 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 
01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.176202 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.201640 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.218410 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.231609 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.245281 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.268058 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.285819 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc 
kubenswrapper[4735]: I0317 01:11:38.307924 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.322448 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 
01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.344755 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:38 crc kubenswrapper[4735]: I0317 01:11:38.358675 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:38Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:39 crc kubenswrapper[4735]: I0317 01:11:39.912348 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:11:39 crc kubenswrapper[4735]: I0317 01:11:39.912598 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:39 crc kubenswrapper[4735]: E0317 01:11:39.912746 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:39 crc kubenswrapper[4735]: E0317 01:11:39.912830 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:12:11.912805819 +0000 UTC m=+157.545038837 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:11:39 crc kubenswrapper[4735]: E0317 01:11:39.913201 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:12:11.913156597 +0000 UTC m=+157.545389605 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.013475 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.013833 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.014075 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:12:12.014031974 +0000 UTC m=+157.646265052 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.013921 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.014276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.014576 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.014630 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:40 crc 
kubenswrapper[4735]: E0317 01:11:40.014651 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.014737 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:12:12.014708471 +0000 UTC m=+157.646941479 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.014582 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.015125 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.015267 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.015477 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:12:12.015440118 +0000 UTC m=+157.647673136 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.072338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.072438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.072501 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.072938 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.073408 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.073084 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.073760 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.073961 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:40 crc kubenswrapper[4735]: I0317 01:11:40.091007 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 17 01:11:40 crc kubenswrapper[4735]: E0317 01:11:40.155137 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 17 01:11:42 crc kubenswrapper[4735]: I0317 01:11:42.038840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:42 crc kubenswrapper[4735]: E0317 01:11:42.039224 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:42 crc kubenswrapper[4735]: E0317 01:11:42.039375 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:11:58.039336747 +0000 UTC m=+143.671569775 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:42 crc kubenswrapper[4735]: I0317 01:11:42.073072 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:42 crc kubenswrapper[4735]: I0317 01:11:42.073125 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:42 crc kubenswrapper[4735]: I0317 01:11:42.073149 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:42 crc kubenswrapper[4735]: I0317 01:11:42.073072 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:42 crc kubenswrapper[4735]: E0317 01:11:42.073297 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:42 crc kubenswrapper[4735]: E0317 01:11:42.073465 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:42 crc kubenswrapper[4735]: E0317 01:11:42.073594 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:42 crc kubenswrapper[4735]: E0317 01:11:42.073722 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:44 crc kubenswrapper[4735]: I0317 01:11:44.072453 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:44 crc kubenswrapper[4735]: I0317 01:11:44.072489 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:44 crc kubenswrapper[4735]: E0317 01:11:44.074241 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:44 crc kubenswrapper[4735]: I0317 01:11:44.072615 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:44 crc kubenswrapper[4735]: I0317 01:11:44.072570 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:44 crc kubenswrapper[4735]: E0317 01:11:44.074385 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:44 crc kubenswrapper[4735]: E0317 01:11:44.074501 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:44 crc kubenswrapper[4735]: E0317 01:11:44.074622 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.099687 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.122386 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.142770 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: E0317 01:11:45.156912 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.179560 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.197651 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.219623 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.241993 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.259012 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.283663 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.312911 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 
01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.334138 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.354342 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.373595 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.402283 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.423791 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.450368 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:45 crc kubenswrapper[4735]: I0317 01:11:45.469971 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:45Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:46 crc 
kubenswrapper[4735]: I0317 01:11:46.072136 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.072174 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.072207 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.072156 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:46 crc kubenswrapper[4735]: E0317 01:11:46.072334 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:46 crc kubenswrapper[4735]: E0317 01:11:46.072645 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:46 crc kubenswrapper[4735]: E0317 01:11:46.072615 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:46 crc kubenswrapper[4735]: E0317 01:11:46.072761 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.951485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.951543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.951580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.951614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.951657 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:46Z","lastTransitionTime":"2026-03-17T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:46 crc kubenswrapper[4735]: E0317 01:11:46.969641 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:46Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.973802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.973828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.973838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.973868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:46 crc kubenswrapper[4735]: I0317 01:11:46.973884 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:46Z","lastTransitionTime":"2026-03-17T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:46 crc kubenswrapper[4735]: E0317 01:11:46.993508 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:46Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.001195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.001268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.001296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.001329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.001348 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:47Z","lastTransitionTime":"2026-03-17T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:47 crc kubenswrapper[4735]: E0317 01:11:47.025075 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:47Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.030913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.030972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.030991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.031020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.031037 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:47Z","lastTransitionTime":"2026-03-17T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:47 crc kubenswrapper[4735]: E0317 01:11:47.050300 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:47Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.056550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.056600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.056617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.056641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:47 crc kubenswrapper[4735]: I0317 01:11:47.056661 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:47Z","lastTransitionTime":"2026-03-17T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:47 crc kubenswrapper[4735]: E0317 01:11:47.078410 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:47Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:47 crc kubenswrapper[4735]: E0317 01:11:47.078569 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:11:48 crc kubenswrapper[4735]: I0317 01:11:48.072788 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:48 crc kubenswrapper[4735]: I0317 01:11:48.072846 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:48 crc kubenswrapper[4735]: I0317 01:11:48.072904 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:48 crc kubenswrapper[4735]: E0317 01:11:48.073068 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:48 crc kubenswrapper[4735]: I0317 01:11:48.073149 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:48 crc kubenswrapper[4735]: E0317 01:11:48.073292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:48 crc kubenswrapper[4735]: E0317 01:11:48.073422 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:48 crc kubenswrapper[4735]: E0317 01:11:48.073601 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:49 crc kubenswrapper[4735]: I0317 01:11:49.074066 4735 scope.go:117] "RemoveContainer" containerID="c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57" Mar 17 01:11:49 crc kubenswrapper[4735]: E0317 01:11:49.074447 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:11:50 crc kubenswrapper[4735]: I0317 01:11:50.072399 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:50 crc kubenswrapper[4735]: I0317 01:11:50.072440 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:50 crc kubenswrapper[4735]: E0317 01:11:50.072604 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:50 crc kubenswrapper[4735]: I0317 01:11:50.072689 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:50 crc kubenswrapper[4735]: I0317 01:11:50.072788 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:50 crc kubenswrapper[4735]: E0317 01:11:50.072963 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:50 crc kubenswrapper[4735]: E0317 01:11:50.073079 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:50 crc kubenswrapper[4735]: E0317 01:11:50.073194 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:50 crc kubenswrapper[4735]: E0317 01:11:50.157919 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:11:52 crc kubenswrapper[4735]: I0317 01:11:52.072837 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:52 crc kubenswrapper[4735]: I0317 01:11:52.073016 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:52 crc kubenswrapper[4735]: E0317 01:11:52.073095 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:52 crc kubenswrapper[4735]: I0317 01:11:52.072839 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:52 crc kubenswrapper[4735]: E0317 01:11:52.073215 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:52 crc kubenswrapper[4735]: I0317 01:11:52.072920 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:52 crc kubenswrapper[4735]: E0317 01:11:52.073358 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:52 crc kubenswrapper[4735]: E0317 01:11:52.073554 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:54 crc kubenswrapper[4735]: I0317 01:11:54.072149 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:54 crc kubenswrapper[4735]: I0317 01:11:54.072201 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:54 crc kubenswrapper[4735]: I0317 01:11:54.072200 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:54 crc kubenswrapper[4735]: E0317 01:11:54.073316 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:54 crc kubenswrapper[4735]: E0317 01:11:54.073097 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:54 crc kubenswrapper[4735]: E0317 01:11:54.073427 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:54 crc kubenswrapper[4735]: I0317 01:11:54.072261 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:54 crc kubenswrapper[4735]: E0317 01:11:54.073603 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.089763 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.114897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.136004 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.155812 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: E0317 01:11:55.159062 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.183238 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.205605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.229835 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.250593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.271812 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.287612 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.302382 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737
b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\
\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.321136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\
\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.332715 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.345897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.361793 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.383466 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:55 crc kubenswrapper[4735]: I0317 01:11:55.400553 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:55Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:56 crc 
kubenswrapper[4735]: I0317 01:11:56.072767 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:56 crc kubenswrapper[4735]: I0317 01:11:56.072909 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:56 crc kubenswrapper[4735]: I0317 01:11:56.072926 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:56 crc kubenswrapper[4735]: I0317 01:11:56.072800 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:56 crc kubenswrapper[4735]: E0317 01:11:56.073071 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:56 crc kubenswrapper[4735]: E0317 01:11:56.073333 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:11:56 crc kubenswrapper[4735]: E0317 01:11:56.073508 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:56 crc kubenswrapper[4735]: E0317 01:11:56.073606 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.105406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.106716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.106975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.107269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.107493 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:57Z","lastTransitionTime":"2026-03-17T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:11:57 crc kubenswrapper[4735]: E0317 01:11:57.135275 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:57Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.142054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.142118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.142137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.142168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.142187 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:57Z","lastTransitionTime":"2026-03-17T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:57 crc kubenswrapper[4735]: E0317 01:11:57.166097 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:57Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.172447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.172709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.172964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.173195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.173419 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:57Z","lastTransitionTime":"2026-03-17T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:57 crc kubenswrapper[4735]: E0317 01:11:57.197280 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:57Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.202951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.203022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.203046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.203073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.203091 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:57Z","lastTransitionTime":"2026-03-17T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:57 crc kubenswrapper[4735]: E0317 01:11:57.224847 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:57Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.230645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.230722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.230746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.230778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:11:57 crc kubenswrapper[4735]: I0317 01:11:57.230796 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:11:57Z","lastTransitionTime":"2026-03-17T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:11:57 crc kubenswrapper[4735]: E0317 01:11:57.254032 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:11:57Z is after 2025-08-24T17:21:41Z" Mar 17 01:11:57 crc kubenswrapper[4735]: E0317 01:11:57.254621 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:11:58 crc kubenswrapper[4735]: I0317 01:11:58.049946 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:58 crc kubenswrapper[4735]: E0317 01:11:58.050235 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:58 crc kubenswrapper[4735]: E0317 01:11:58.050730 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:12:30.050692509 +0000 UTC m=+175.682925517 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:11:58 crc kubenswrapper[4735]: I0317 01:11:58.072091 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:11:58 crc kubenswrapper[4735]: E0317 01:11:58.072292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:11:58 crc kubenswrapper[4735]: I0317 01:11:58.072364 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:11:58 crc kubenswrapper[4735]: I0317 01:11:58.072483 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:11:58 crc kubenswrapper[4735]: E0317 01:11:58.072504 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:11:58 crc kubenswrapper[4735]: E0317 01:11:58.072694 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:11:58 crc kubenswrapper[4735]: I0317 01:11:58.072984 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:11:58 crc kubenswrapper[4735]: E0317 01:11:58.074230 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.072407 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:00 crc kubenswrapper[4735]: E0317 01:12:00.073507 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.072456 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:00 crc kubenswrapper[4735]: E0317 01:12:00.073813 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.072448 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.072647 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:00 crc kubenswrapper[4735]: E0317 01:12:00.074121 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:00 crc kubenswrapper[4735]: E0317 01:12:00.074441 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.137062 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/0.log" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.137128 4735 generic.go:334] "Generic (PLEG): container finished" podID="a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d" containerID="c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416" exitCode=1 Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.137169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerDied","Data":"c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416"} Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.137629 4735 scope.go:117] "RemoveContainer" containerID="c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.157533 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: E0317 01:12:00.160785 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.184143 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.203768 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.226387 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-api
server\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f
36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.274842 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.301340 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.325599 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.336539 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc 
kubenswrapper[4735]: I0317 01:12:00.355825 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.366290 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.376928 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.388897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.400260 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.409678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.420254 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:59Z\\\",\\\"message\\\":\\\"2026-03-17T01:11:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94\\\\n2026-03-17T01:11:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94 to /host/opt/cni/bin/\\\\n2026-03-17T01:11:14Z [verbose] multus-daemon started\\\\n2026-03-17T01:11:14Z [verbose] Readiness Indicator file check\\\\n2026-03-17T01:11:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.431693 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:00 crc kubenswrapper[4735]: I0317 01:12:00.442233 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:00Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.073357 4735 scope.go:117] "RemoveContainer" containerID="c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.142588 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/0.log" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.142651 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerStarted","Data":"e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1"} Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.160202 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.171507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.181911 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.194005 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:59Z\\\",\\\"message\\\":\\\"2026-03-17T01:11:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94\\\\n2026-03-17T01:11:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94 to /host/opt/cni/bin/\\\\n2026-03-17T01:11:14Z [verbose] multus-daemon started\\\\n2026-03-17T01:11:14Z [verbose] Readiness Indicator file check\\\\n2026-03-17T01:11:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.203817 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd
32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.216161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65
905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.235720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.249757 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.259738 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.273931 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.288086 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.298345 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc 
kubenswrapper[4735]: I0317 01:12:01.311339 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.328259 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.343935 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.370207 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:01 crc kubenswrapper[4735]: I0317 01:12:01.387849 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.072583 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.072661 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.072687 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:02 crc kubenswrapper[4735]: E0317 01:12:02.072794 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.072944 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:02 crc kubenswrapper[4735]: E0317 01:12:02.073167 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:02 crc kubenswrapper[4735]: E0317 01:12:02.073373 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:02 crc kubenswrapper[4735]: E0317 01:12:02.073652 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.150199 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/3.log" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.151290 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/2.log" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.156535 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" exitCode=1 Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.156557 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.156636 4735 scope.go:117] "RemoveContainer" containerID="c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.157777 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:12:02 crc kubenswrapper[4735]: E0317 01:12:02.158211 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:12:02 crc 
kubenswrapper[4735]: I0317 01:12:02.189148 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.211387 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.224638 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc 
kubenswrapper[4735]: I0317 01:12:02.244739 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.260828 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.277374 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.301786 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"message\\\":\\\"ntroller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0317 01:12:01.926172 7046 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"8b82f026-5975-4a1b-bb18-08d5d51147ec\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7
ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.314231 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.328525 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.346297 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.361715 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.379801 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:59Z\\\",\\\"message\\\":\\\"2026-03-17T01:11:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94\\\\n2026-03-17T01:11:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94 to /host/opt/cni/bin/\\\\n2026-03-17T01:11:14Z [verbose] multus-daemon started\\\\n2026-03-17T01:11:14Z [verbose] Readiness Indicator file check\\\\n2026-03-17T01:11:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.400269 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65
905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.417375 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.438370 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.458044 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:02 crc kubenswrapper[4735]: I0317 01:12:02.476481 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:02Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:03 crc kubenswrapper[4735]: I0317 01:12:03.162679 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/3.log" Mar 17 01:12:04 crc kubenswrapper[4735]: I0317 01:12:04.072969 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:04 crc kubenswrapper[4735]: I0317 01:12:04.073007 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:04 crc kubenswrapper[4735]: I0317 01:12:04.072969 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:04 crc kubenswrapper[4735]: I0317 01:12:04.073127 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:04 crc kubenswrapper[4735]: E0317 01:12:04.073092 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:04 crc kubenswrapper[4735]: E0317 01:12:04.073174 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:04 crc kubenswrapper[4735]: E0317 01:12:04.073229 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:04 crc kubenswrapper[4735]: E0317 01:12:04.073278 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.095896 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd
32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.117538 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65
905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.138801 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.159402 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: E0317 01:12:05.162109 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.177725 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.198001 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.219321 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.238957 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc 
kubenswrapper[4735]: I0317 01:12:05.260328 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.281953 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.297712 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.328669 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4c25b1f9634b1c11bd5981484bf6a4c156d1fbce7e6f1c0ead7f4db99179e57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:36Z\\\",\\\"message\\\":\\\"[]services.LB{}\\\\nI0317 01:11:36.108116 6803 services_controller.go:454] Service openshift-machine-api/control-plane-machine-set-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0317 01:11:36.108133 6803 services_controller.go:473] 
Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"message\\\":\\\"ntroller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0317 01:12:01.926172 7046 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"8b82f026-5975-4a1b-bb18-08d5d51147ec\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7
ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.343928 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.366351 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17
T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.382464 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.397532 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:05 crc kubenswrapper[4735]: I0317 01:12:05.421295 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:59Z\\\",\\\"message\\\":\\\"2026-03-17T01:11:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94\\\\n2026-03-17T01:11:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94 to /host/opt/cni/bin/\\\\n2026-03-17T01:11:14Z [verbose] multus-daemon started\\\\n2026-03-17T01:11:14Z [verbose] Readiness Indicator file check\\\\n2026-03-17T01:11:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:05Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:06 crc kubenswrapper[4735]: I0317 01:12:06.072481 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:06 crc kubenswrapper[4735]: I0317 01:12:06.072611 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:06 crc kubenswrapper[4735]: E0317 01:12:06.072985 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:06 crc kubenswrapper[4735]: I0317 01:12:06.073027 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:06 crc kubenswrapper[4735]: I0317 01:12:06.073107 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:06 crc kubenswrapper[4735]: E0317 01:12:06.073475 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:06 crc kubenswrapper[4735]: E0317 01:12:06.073628 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:06 crc kubenswrapper[4735]: E0317 01:12:06.073697 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:06 crc kubenswrapper[4735]: I0317 01:12:06.091358 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.473372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.473445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.473465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.473492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.473510 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:07Z","lastTransitionTime":"2026-03-17T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:07 crc kubenswrapper[4735]: E0317 01:12:07.496820 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:07Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.509596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.509656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.509673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.509696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.509713 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:07Z","lastTransitionTime":"2026-03-17T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:07 crc kubenswrapper[4735]: E0317 01:12:07.530528 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:07Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.536018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.536107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.536127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.536150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.536165 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:07Z","lastTransitionTime":"2026-03-17T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:07 crc kubenswrapper[4735]: E0317 01:12:07.557761 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:07Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.562766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.562854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.562985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.563017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.563067 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:07Z","lastTransitionTime":"2026-03-17T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:07 crc kubenswrapper[4735]: E0317 01:12:07.582993 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:07Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.588754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.588800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.588818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.588841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:07 crc kubenswrapper[4735]: I0317 01:12:07.588891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:07Z","lastTransitionTime":"2026-03-17T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:07 crc kubenswrapper[4735]: E0317 01:12:07.610479 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:07Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:07 crc kubenswrapper[4735]: E0317 01:12:07.610700 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:12:08 crc kubenswrapper[4735]: I0317 01:12:08.072441 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:08 crc kubenswrapper[4735]: I0317 01:12:08.072508 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:08 crc kubenswrapper[4735]: E0317 01:12:08.073071 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:08 crc kubenswrapper[4735]: I0317 01:12:08.072583 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:08 crc kubenswrapper[4735]: E0317 01:12:08.073285 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:08 crc kubenswrapper[4735]: I0317 01:12:08.072583 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:08 crc kubenswrapper[4735]: E0317 01:12:08.073605 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:08 crc kubenswrapper[4735]: E0317 01:12:08.073788 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:10 crc kubenswrapper[4735]: I0317 01:12:10.072456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:10 crc kubenswrapper[4735]: I0317 01:12:10.072472 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:10 crc kubenswrapper[4735]: E0317 01:12:10.073092 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:10 crc kubenswrapper[4735]: I0317 01:12:10.072543 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:10 crc kubenswrapper[4735]: E0317 01:12:10.072969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:10 crc kubenswrapper[4735]: I0317 01:12:10.072509 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:10 crc kubenswrapper[4735]: E0317 01:12:10.073286 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:10 crc kubenswrapper[4735]: E0317 01:12:10.073223 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:10 crc kubenswrapper[4735]: E0317 01:12:10.163919 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.568772 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.570887 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:12:11 crc kubenswrapper[4735]: E0317 01:12:11.571147 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.588126 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.611281 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:59Z\\\",\\\"message\\\":\\\"2026-03-17T01:11:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94\\\\n2026-03-17T01:11:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94 to /host/opt/cni/bin/\\\\n2026-03-17T01:11:14Z [verbose] multus-daemon started\\\\n2026-03-17T01:11:14Z [verbose] Readiness Indicator file check\\\\n2026-03-17T01:11:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.631507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb70d4f7-24ef-40bc-85e0-6b3aba532ebd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a8332dbfb1555aad08f74de3fd8d3ce4bc6fe8a386576a0a3383c06ba44c64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12dc95aa9531891349369f0ec1f96283745934cf8364db5ac2d2aa6e21bea258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12dc95aa9531891349369f0ec1f96283745934cf8364db5ac2d2aa6e21bea258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.650648 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.678319 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.697554 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.716307 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.736258 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.756026 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a7
6de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.777978 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.798538 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.820440 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.834510 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc 
kubenswrapper[4735]: I0317 01:12:11.873580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"message\\\":\\\"ntroller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0317 01:12:01.926172 7046 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"8b82f026-5975-4a1b-bb18-08d5d51147ec\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.889379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.907699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.934310 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:11 crc kubenswrapper[4735]: I0317 01:12:11.950892 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:11Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478444 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478535 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478567 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478622 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478658 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478667 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478762 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478788 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478820 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478848 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478899 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478902 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.478822959 +0000 UTC m=+222.111055977 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.478956 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478978 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.478951362 +0000 UTC m=+222.111184380 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.478979 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:12 crc kubenswrapper[4735]: I0317 01:12:12.479057 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479092 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479063 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479222 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479270 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.479242308 +0000 UTC m=+222.111475286 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479091 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479291 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479303 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479328 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.47932136 +0000 UTC m=+222.111554338 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479126 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:12:12 crc kubenswrapper[4735]: E0317 01:12:12.479364 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.479358801 +0000 UTC m=+222.111591779 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 01:12:14 crc kubenswrapper[4735]: I0317 01:12:14.072664 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:14 crc kubenswrapper[4735]: E0317 01:12:14.072916 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:14 crc kubenswrapper[4735]: I0317 01:12:14.072704 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:14 crc kubenswrapper[4735]: E0317 01:12:14.073055 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:14 crc kubenswrapper[4735]: I0317 01:12:14.072664 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:14 crc kubenswrapper[4735]: E0317 01:12:14.073215 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:14 crc kubenswrapper[4735]: I0317 01:12:14.074528 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:14 crc kubenswrapper[4735]: E0317 01:12:14.074838 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.097255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mm58f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:11:59Z\\\",\\\"message\\\":\\\"2026-03-17T01:11:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94\\\\n2026-03-17T01:11:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1fee608f-a0a3-4697-9270-6df42dacae94 to /host/opt/cni/bin/\\\\n2026-03-17T01:11:14Z [verbose] multus-daemon started\\\\n2026-03-17T01:11:14Z [verbose] Readiness Indicator file check\\\\n2026-03-17T01:11:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lqtql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mm58f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.118619 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb70d4f7-24ef-40bc-85e0-6b3aba532ebd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a8332dbfb1555aad08f74de3fd8d3ce4bc6fe8a386576a0a3383c06ba44c64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12dc95aa9531891349369f0ec1f96283745934cf8364db5ac2d2aa6e21bea258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12dc95aa9531891349369f0ec1f96283745934cf8364db5ac2d2aa6e21bea258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.141158 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4568151225bb3727279e1c3b915bfff0c4b50dcb1dcbae64229f78756db15493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf63581bc82169bb1a35fa4cb346f5a450786a1f94c4b165612be2bf4067323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.162034 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: E0317 01:12:15.164891 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.187561 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2s9p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d065a5d-f163-4cbd-8790-023a32481e7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590eae056e964d807386375bc96cc2f4a281c7a90b528586be822c1cdaa72df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-g86g4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2s9p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.205077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb65331968c75b62c643a2de1ce5ad9e3ad763a513f52af826b897a2f600097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445
c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.220455 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6577add2-4376-4c19-b025-31caf2138e36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3baf3318dd79ba4f737b01c48ef838a4ad4878c6c354ada6f68466c0361263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b6cc1ed08ee31cfeea5238e08856f46ecd
32da5461f2087e972b58f7d1bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfmlm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fl2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.245124 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0317 01:10:33.670781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 01:10:33.670928 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 01:10:33.671531 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2937417545/tls.crt::/tmp/serving-cert-2937417545/tls.key\\\\\\\"\\\\nI0317 01:10:33.949237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 01:10:33.953548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 01:10:33.953594 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 01:10:33.953634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 01:10:33.953644 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 01:10:33.961930 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 01:10:33.961956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 01:10:33.961968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 01:10:33.961972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 01:10:33.961977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 01:10:33.961981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0317 01:10:33.962148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0317 01:10:33.967790 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:10:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8efc856e751d9d989737731d6d8644f65
905ed5e9a9215645b83dc7830a4b3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.265372 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77ae6317-28e3-4038-9866-0111e4daba44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d608fcec166f00611dce769ccb65a662685d31d839c10c2ba39ca85d4ff8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://584b42f0d6bfe8d40cec65bc50f535a1ef6c56aa975d0890f206e6850a270af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f65886df3193f82ed337a564ad8a39ab8022dc0b09140a46a4dfb6ee7df5d3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4bcaf220b1cffe004323024b1aa31f48588405b2be447dc924aa309918f9bb87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:09:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.288196 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.310736 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.335212 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26e9b32a-eeef-45ee-8107-2a94625bbf6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c6b523a7627bd613ead5539228220b51d31e6af0e633522e846f79452e67468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c831d24a0a57618c4c8f118815aeac03b76731b28aeb597122babb39df71a30c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f049821ab371d9362325fa8ed853d50350f2acf2e5617d5bc29677e1720b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ae3d8a847b2ef15dc59f0679199e416ae6cdb6c5d6890da13cc3fa068fcc713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e08
816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e08816e4b26d914b8cbdc092d24ac5d697187b9a98a23e8af061955470a664\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d0b6f78261315a35d28ca1390d1193b79126adf423f8e37180d69bbeecdbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410a2e54b41b30c8d33acf44077fddd689233675983465b4dd94486bec6c2831\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztfcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9pv7f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.353283 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a72fe2c-32fb-4360-882b-44debb825c9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdzxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dkwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc 
kubenswrapper[4735]: I0317 01:12:15.369185 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4rx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"434a2452-dc92-41c5-9236-02bd0d70b401\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48bd7571d48451bbe8c6dc4e4fc6efd79aff12d8aa5e2694c4d1b5fdc729ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d
nwtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4rx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.395015 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0732b86f-f277-47e8-ad77-e8a4e145f195\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fdbb86fbe1fed3554c79ef74caf68b972e207894ddc709e00864ba7654a2d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ffe16eb8eccde2c4c0ae32eeb624e2ac4464570af2dcff507ea2e6dc1d4bd95\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T01:10:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 01:09:37.258666 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 01:09:37.262942 1 observer_polling.go:159] Starting file observer\\\\nI0317 01:09:37.295972 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 01:09:37.300431 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0317 01:10:07.516733 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:10:06Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c48581a371876d42f44f3a9d5fc767bb68d91070b8681fad810206d74db356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d1365387c930c5e0c4fd4aca02347b7c191a400a6f912bc0f4b9b4509e8e63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:09:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:09:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.417246 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f66462b5560384ab25f550268716726f17a0ab1bd4cc727338fd8f2b01a970dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netwo
rk-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.438386 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe43c53-58a2-4450-a71c-667e10384678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75afe4b7054c216667494d05eba380c31f69a1a66807b18462b95f0ab2546bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af
065f024316f762e7a4828e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97qmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z669m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:15 crc kubenswrapper[4735]: I0317 01:12:15.469654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d25c473-740d-4af9-b5f7-72bfc5d911a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T01:12:01Z\\\",\\\"message\\\":\\\"ntroller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0317 01:12:01.926172 7046 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"8b82f026-5975-4a1b-bb18-08d5d51147ec\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T01:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6df1353bbad289d27
f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T01:11:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T01:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhbd4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T01:11:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5mhq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:15Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:16 crc kubenswrapper[4735]: I0317 01:12:16.072525 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:16 crc kubenswrapper[4735]: I0317 01:12:16.072542 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:16 crc kubenswrapper[4735]: I0317 01:12:16.073025 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:16 crc kubenswrapper[4735]: I0317 01:12:16.073257 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:16 crc kubenswrapper[4735]: E0317 01:12:16.099838 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:16 crc kubenswrapper[4735]: E0317 01:12:16.100265 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:16 crc kubenswrapper[4735]: E0317 01:12:16.100425 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:16 crc kubenswrapper[4735]: E0317 01:12:16.100543 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.668643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.669096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.669115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.669146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.669163 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:17Z","lastTransitionTime":"2026-03-17T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:17 crc kubenswrapper[4735]: E0317 01:12:17.690767 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.696834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.696932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.696953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.696982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.697000 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:17Z","lastTransitionTime":"2026-03-17T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:17 crc kubenswrapper[4735]: E0317 01:12:17.718748 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.724188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.724282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.724331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.724357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.724375 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:17Z","lastTransitionTime":"2026-03-17T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:17 crc kubenswrapper[4735]: E0317 01:12:17.748277 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.752694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.752757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.752774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.752801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.752818 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:17Z","lastTransitionTime":"2026-03-17T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:17 crc kubenswrapper[4735]: E0317 01:12:17.785551 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.791915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.791985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.792007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.792041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:17 crc kubenswrapper[4735]: I0317 01:12:17.792064 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:17Z","lastTransitionTime":"2026-03-17T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 01:12:17 crc kubenswrapper[4735]: E0317 01:12:17.813121 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44ff1d0f-f4fc-4cb2-a632-dc715bf02555\\\",\\\"systemUUID\\\":\\\"4bab9c2b-e779-4cf2-8464-6eca29daaf6c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T01:12:17Z is after 2025-08-24T17:21:41Z" Mar 17 01:12:17 crc kubenswrapper[4735]: E0317 01:12:17.813504 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 01:12:18 crc kubenswrapper[4735]: I0317 01:12:18.072293 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:18 crc kubenswrapper[4735]: I0317 01:12:18.072344 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:18 crc kubenswrapper[4735]: I0317 01:12:18.072370 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:18 crc kubenswrapper[4735]: I0317 01:12:18.072356 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:18 crc kubenswrapper[4735]: E0317 01:12:18.072496 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:18 crc kubenswrapper[4735]: E0317 01:12:18.072694 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:18 crc kubenswrapper[4735]: E0317 01:12:18.072836 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:18 crc kubenswrapper[4735]: E0317 01:12:18.073008 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:20 crc kubenswrapper[4735]: I0317 01:12:20.072219 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:20 crc kubenswrapper[4735]: I0317 01:12:20.072313 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:20 crc kubenswrapper[4735]: I0317 01:12:20.072311 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:20 crc kubenswrapper[4735]: I0317 01:12:20.072259 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:20 crc kubenswrapper[4735]: E0317 01:12:20.072437 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:20 crc kubenswrapper[4735]: E0317 01:12:20.072852 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:20 crc kubenswrapper[4735]: E0317 01:12:20.072994 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:20 crc kubenswrapper[4735]: E0317 01:12:20.073089 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:20 crc kubenswrapper[4735]: E0317 01:12:20.167025 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:22 crc kubenswrapper[4735]: I0317 01:12:22.072539 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:22 crc kubenswrapper[4735]: I0317 01:12:22.072616 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:22 crc kubenswrapper[4735]: I0317 01:12:22.072627 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:22 crc kubenswrapper[4735]: I0317 01:12:22.072578 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:22 crc kubenswrapper[4735]: E0317 01:12:22.072766 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:22 crc kubenswrapper[4735]: E0317 01:12:22.072969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:22 crc kubenswrapper[4735]: E0317 01:12:22.073184 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:22 crc kubenswrapper[4735]: E0317 01:12:22.073298 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:24 crc kubenswrapper[4735]: I0317 01:12:24.072759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:24 crc kubenswrapper[4735]: E0317 01:12:24.073050 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:24 crc kubenswrapper[4735]: I0317 01:12:24.073109 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:24 crc kubenswrapper[4735]: I0317 01:12:24.073127 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:24 crc kubenswrapper[4735]: I0317 01:12:24.073216 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:24 crc kubenswrapper[4735]: E0317 01:12:24.073379 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:24 crc kubenswrapper[4735]: E0317 01:12:24.073827 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:24 crc kubenswrapper[4735]: E0317 01:12:24.074169 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:24 crc kubenswrapper[4735]: I0317 01:12:24.098487 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.076417 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:12:25 crc kubenswrapper[4735]: E0317 01:12:25.076702 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.141120 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=45.141053318 podStartE2EDuration="45.141053318s" podCreationTimestamp="2026-03-17 01:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.115723544 +0000 UTC m=+170.747956552" watchObservedRunningTime="2026-03-17 01:12:25.141053318 +0000 UTC m=+170.773286336" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.162830 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podStartSLOduration=105.162807147 podStartE2EDuration="1m45.162807147s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.162253714 +0000 UTC m=+170.794486732" watchObservedRunningTime="2026-03-17 01:12:25.162807147 +0000 UTC m=+170.795040165" Mar 17 01:12:25 crc kubenswrapper[4735]: E0317 01:12:25.167936 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.246834 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.246806408 podStartE2EDuration="19.246806408s" podCreationTimestamp="2026-03-17 01:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.246686726 +0000 UTC m=+170.878919714" watchObservedRunningTime="2026-03-17 01:12:25.246806408 +0000 UTC m=+170.879039426" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.247883 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gt4rx" podStartSLOduration=105.247845313 podStartE2EDuration="1m45.247845313s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.232739813 +0000 UTC m=+170.864972831" watchObservedRunningTime="2026-03-17 01:12:25.247845313 +0000 UTC m=+170.880078331" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.302908 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2s9p2" podStartSLOduration=105.302852194 podStartE2EDuration="1m45.302852194s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.302027824 +0000 UTC m=+170.934260812" watchObservedRunningTime="2026-03-17 01:12:25.302852194 +0000 UTC m=+170.935085212" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.325398 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mm58f" podStartSLOduration=105.32537254 podStartE2EDuration="1m45.32537254s" 
podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.324219983 +0000 UTC m=+170.956453001" watchObservedRunningTime="2026-03-17 01:12:25.32537254 +0000 UTC m=+170.957605558" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.402471 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.402442137 podStartE2EDuration="1.402442137s" podCreationTimestamp="2026-03-17 01:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.402356025 +0000 UTC m=+171.034589013" watchObservedRunningTime="2026-03-17 01:12:25.402442137 +0000 UTC m=+171.034675145" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.403389 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fl2dw" podStartSLOduration=104.403376599 podStartE2EDuration="1m44.403376599s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.356715067 +0000 UTC m=+170.988948055" watchObservedRunningTime="2026-03-17 01:12:25.403376599 +0000 UTC m=+171.035609607" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.428323 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.428302073 podStartE2EDuration="1m8.428302073s" podCreationTimestamp="2026-03-17 01:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.426652563 +0000 UTC m=+171.058885551" watchObservedRunningTime="2026-03-17 
01:12:25.428302073 +0000 UTC m=+171.060535061" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.444713 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.444688723 podStartE2EDuration="48.444688723s" podCreationTimestamp="2026-03-17 01:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.443894035 +0000 UTC m=+171.076127023" watchObservedRunningTime="2026-03-17 01:12:25.444688723 +0000 UTC m=+171.076921731" Mar 17 01:12:25 crc kubenswrapper[4735]: I0317 01:12:25.516466 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9pv7f" podStartSLOduration=105.516443144 podStartE2EDuration="1m45.516443144s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:25.515387868 +0000 UTC m=+171.147620856" watchObservedRunningTime="2026-03-17 01:12:25.516443144 +0000 UTC m=+171.148676132" Mar 17 01:12:26 crc kubenswrapper[4735]: I0317 01:12:26.073128 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:26 crc kubenswrapper[4735]: I0317 01:12:26.073162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:26 crc kubenswrapper[4735]: I0317 01:12:26.073312 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:26 crc kubenswrapper[4735]: E0317 01:12:26.073563 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:26 crc kubenswrapper[4735]: E0317 01:12:26.073740 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:26 crc kubenswrapper[4735]: E0317 01:12:26.073932 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:26 crc kubenswrapper[4735]: I0317 01:12:26.074013 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:26 crc kubenswrapper[4735]: E0317 01:12:26.074188 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.072080 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.072144 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.072144 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.072296 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:28 crc kubenswrapper[4735]: E0317 01:12:28.072402 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:28 crc kubenswrapper[4735]: E0317 01:12:28.072533 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:28 crc kubenswrapper[4735]: E0317 01:12:28.072671 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:28 crc kubenswrapper[4735]: E0317 01:12:28.072922 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.126594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.126670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.126694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.126727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.126757 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T01:12:28Z","lastTransitionTime":"2026-03-17T01:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.202134 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp"] Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.202755 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.205237 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.205736 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.206331 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.206714 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.383132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e25d3ae-a6fa-459f-ac0c-267463384f4a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.383748 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e25d3ae-a6fa-459f-ac0c-267463384f4a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.384023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8e25d3ae-a6fa-459f-ac0c-267463384f4a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.384294 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e25d3ae-a6fa-459f-ac0c-267463384f4a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.384483 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e25d3ae-a6fa-459f-ac0c-267463384f4a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.486348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e25d3ae-a6fa-459f-ac0c-267463384f4a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.487582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e25d3ae-a6fa-459f-ac0c-267463384f4a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.487679 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.488101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e25d3ae-a6fa-459f-ac0c-267463384f4a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.487849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e25d3ae-a6fa-459f-ac0c-267463384f4a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.488542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e25d3ae-a6fa-459f-ac0c-267463384f4a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.488750 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e25d3ae-a6fa-459f-ac0c-267463384f4a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 
01:12:28.488977 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e25d3ae-a6fa-459f-ac0c-267463384f4a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.490225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e25d3ae-a6fa-459f-ac0c-267463384f4a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.503279 4735 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.508078 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e25d3ae-a6fa-459f-ac0c-267463384f4a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.520469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e25d3ae-a6fa-459f-ac0c-267463384f4a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bkjwp\" (UID: \"8e25d3ae-a6fa-459f-ac0c-267463384f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:28 crc kubenswrapper[4735]: I0317 01:12:28.525788 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" Mar 17 01:12:29 crc kubenswrapper[4735]: I0317 01:12:29.553697 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" event={"ID":"8e25d3ae-a6fa-459f-ac0c-267463384f4a","Type":"ContainerStarted","Data":"5b954401a2564ad91c0a1c9fba4b02edbc25af9a98846d1274258fac41598e87"} Mar 17 01:12:29 crc kubenswrapper[4735]: I0317 01:12:29.554169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" event={"ID":"8e25d3ae-a6fa-459f-ac0c-267463384f4a","Type":"ContainerStarted","Data":"7fbad493df21d15eb618a1845eff7046e0e1964dc7f6f85936ef583a638ab21c"} Mar 17 01:12:29 crc kubenswrapper[4735]: I0317 01:12:29.575779 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkjwp" podStartSLOduration=109.57574257 podStartE2EDuration="1m49.57574257s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:29.573715241 +0000 UTC m=+175.205948249" watchObservedRunningTime="2026-03-17 01:12:29.57574257 +0000 UTC m=+175.207975588" Mar 17 01:12:30 crc kubenswrapper[4735]: I0317 01:12:30.073051 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:30 crc kubenswrapper[4735]: I0317 01:12:30.073159 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:30 crc kubenswrapper[4735]: I0317 01:12:30.073254 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.073263 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.073418 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.073610 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:30 crc kubenswrapper[4735]: I0317 01:12:30.073853 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.074219 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:30 crc kubenswrapper[4735]: I0317 01:12:30.110992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.111233 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.111303 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs podName:3a72fe2c-32fb-4360-882b-44debb825c9e nodeName:}" failed. No retries permitted until 2026-03-17 01:13:34.11128059 +0000 UTC m=+239.743513578 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs") pod "network-metrics-daemon-dkwf5" (UID: "3a72fe2c-32fb-4360-882b-44debb825c9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 01:12:30 crc kubenswrapper[4735]: E0317 01:12:30.169730 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:32 crc kubenswrapper[4735]: I0317 01:12:32.072379 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:32 crc kubenswrapper[4735]: I0317 01:12:32.072444 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:32 crc kubenswrapper[4735]: I0317 01:12:32.072488 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:32 crc kubenswrapper[4735]: I0317 01:12:32.072382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:32 crc kubenswrapper[4735]: E0317 01:12:32.072613 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:32 crc kubenswrapper[4735]: E0317 01:12:32.072795 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:32 crc kubenswrapper[4735]: E0317 01:12:32.072969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:32 crc kubenswrapper[4735]: E0317 01:12:32.073047 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:34 crc kubenswrapper[4735]: I0317 01:12:34.072900 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:34 crc kubenswrapper[4735]: I0317 01:12:34.072958 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:34 crc kubenswrapper[4735]: I0317 01:12:34.072975 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:34 crc kubenswrapper[4735]: I0317 01:12:34.072935 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:34 crc kubenswrapper[4735]: E0317 01:12:34.073126 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:34 crc kubenswrapper[4735]: E0317 01:12:34.073478 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:34 crc kubenswrapper[4735]: E0317 01:12:34.073621 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:34 crc kubenswrapper[4735]: E0317 01:12:34.073747 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:35 crc kubenswrapper[4735]: E0317 01:12:35.170456 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:36 crc kubenswrapper[4735]: I0317 01:12:36.072123 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:36 crc kubenswrapper[4735]: I0317 01:12:36.072167 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:36 crc kubenswrapper[4735]: I0317 01:12:36.072219 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:36 crc kubenswrapper[4735]: E0317 01:12:36.072269 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:36 crc kubenswrapper[4735]: I0317 01:12:36.072139 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:36 crc kubenswrapper[4735]: E0317 01:12:36.072795 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:36 crc kubenswrapper[4735]: E0317 01:12:36.073024 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:36 crc kubenswrapper[4735]: E0317 01:12:36.073379 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:36 crc kubenswrapper[4735]: I0317 01:12:36.073444 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:12:36 crc kubenswrapper[4735]: E0317 01:12:36.073682 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5mhq_openshift-ovn-kubernetes(5d25c473-740d-4af9-b5f7-72bfc5d911a4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" Mar 17 01:12:38 crc kubenswrapper[4735]: I0317 01:12:38.072409 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:38 crc kubenswrapper[4735]: I0317 01:12:38.072493 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:38 crc kubenswrapper[4735]: I0317 01:12:38.072519 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:38 crc kubenswrapper[4735]: I0317 01:12:38.072443 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:38 crc kubenswrapper[4735]: E0317 01:12:38.072648 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:38 crc kubenswrapper[4735]: E0317 01:12:38.072726 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:38 crc kubenswrapper[4735]: E0317 01:12:38.072988 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:38 crc kubenswrapper[4735]: E0317 01:12:38.073135 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:40 crc kubenswrapper[4735]: I0317 01:12:40.072793 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:40 crc kubenswrapper[4735]: I0317 01:12:40.073084 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:40 crc kubenswrapper[4735]: I0317 01:12:40.073139 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:40 crc kubenswrapper[4735]: I0317 01:12:40.073251 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:40 crc kubenswrapper[4735]: E0317 01:12:40.073441 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:40 crc kubenswrapper[4735]: E0317 01:12:40.073787 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:40 crc kubenswrapper[4735]: E0317 01:12:40.074274 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:40 crc kubenswrapper[4735]: E0317 01:12:40.074441 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:40 crc kubenswrapper[4735]: E0317 01:12:40.172442 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:42 crc kubenswrapper[4735]: I0317 01:12:42.072726 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:42 crc kubenswrapper[4735]: I0317 01:12:42.072837 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:42 crc kubenswrapper[4735]: E0317 01:12:42.073319 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:42 crc kubenswrapper[4735]: I0317 01:12:42.072986 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:42 crc kubenswrapper[4735]: I0317 01:12:42.072944 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:42 crc kubenswrapper[4735]: E0317 01:12:42.073532 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:42 crc kubenswrapper[4735]: E0317 01:12:42.073734 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:42 crc kubenswrapper[4735]: E0317 01:12:42.073963 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:44 crc kubenswrapper[4735]: I0317 01:12:44.072506 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:44 crc kubenswrapper[4735]: I0317 01:12:44.072618 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:44 crc kubenswrapper[4735]: I0317 01:12:44.072618 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:44 crc kubenswrapper[4735]: I0317 01:12:44.072760 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:44 crc kubenswrapper[4735]: E0317 01:12:44.072755 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:44 crc kubenswrapper[4735]: E0317 01:12:44.072965 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:44 crc kubenswrapper[4735]: E0317 01:12:44.073113 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:44 crc kubenswrapper[4735]: E0317 01:12:44.073195 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:45 crc kubenswrapper[4735]: E0317 01:12:45.173624 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.072105 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:46 crc kubenswrapper[4735]: E0317 01:12:46.072323 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.072584 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:46 crc kubenswrapper[4735]: E0317 01:12:46.072671 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.072850 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:46 crc kubenswrapper[4735]: E0317 01:12:46.072987 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.073139 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:46 crc kubenswrapper[4735]: E0317 01:12:46.073366 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.624919 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/1.log" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.626335 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/0.log" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.626432 4735 generic.go:334] "Generic (PLEG): container finished" podID="a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d" containerID="e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1" exitCode=1 Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.626486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerDied","Data":"e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1"} Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.626551 4735 scope.go:117] "RemoveContainer" containerID="c5bdc59050610c1f166c9fdcb9c4b41bb028559537ff4f162339b3225ca09416" Mar 17 01:12:46 crc kubenswrapper[4735]: I0317 01:12:46.627374 4735 scope.go:117] "RemoveContainer" containerID="e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1" Mar 17 01:12:46 crc kubenswrapper[4735]: E0317 01:12:46.628085 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mm58f_openshift-multus(a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d)\"" pod="openshift-multus/multus-mm58f" podUID="a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d" Mar 17 01:12:47 crc kubenswrapper[4735]: I0317 01:12:47.634052 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/1.log" Mar 17 01:12:48 crc kubenswrapper[4735]: I0317 01:12:48.072836 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:48 crc kubenswrapper[4735]: E0317 01:12:48.072992 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:48 crc kubenswrapper[4735]: I0317 01:12:48.073111 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:48 crc kubenswrapper[4735]: I0317 01:12:48.073410 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:48 crc kubenswrapper[4735]: E0317 01:12:48.073678 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:48 crc kubenswrapper[4735]: E0317 01:12:48.073974 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:48 crc kubenswrapper[4735]: I0317 01:12:48.074173 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:48 crc kubenswrapper[4735]: E0317 01:12:48.074322 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:49 crc kubenswrapper[4735]: I0317 01:12:49.074065 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:12:49 crc kubenswrapper[4735]: I0317 01:12:49.642980 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/3.log" Mar 17 01:12:49 crc kubenswrapper[4735]: I0317 01:12:49.646773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerStarted","Data":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} Mar 17 01:12:49 crc kubenswrapper[4735]: I0317 01:12:49.647500 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:12:49 crc kubenswrapper[4735]: I0317 01:12:49.671507 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podStartSLOduration=129.671486266 
podStartE2EDuration="2m9.671486266s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:12:49.669832262 +0000 UTC m=+195.302065250" watchObservedRunningTime="2026-03-17 01:12:49.671486266 +0000 UTC m=+195.303719254" Mar 17 01:12:50 crc kubenswrapper[4735]: I0317 01:12:50.032296 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dkwf5"] Mar 17 01:12:50 crc kubenswrapper[4735]: I0317 01:12:50.032410 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:50 crc kubenswrapper[4735]: E0317 01:12:50.032514 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:50 crc kubenswrapper[4735]: I0317 01:12:50.072924 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:50 crc kubenswrapper[4735]: I0317 01:12:50.072983 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:50 crc kubenswrapper[4735]: I0317 01:12:50.072951 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:50 crc kubenswrapper[4735]: E0317 01:12:50.073204 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:50 crc kubenswrapper[4735]: E0317 01:12:50.073430 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:50 crc kubenswrapper[4735]: E0317 01:12:50.073625 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:50 crc kubenswrapper[4735]: E0317 01:12:50.175474 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:52 crc kubenswrapper[4735]: I0317 01:12:52.072555 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:52 crc kubenswrapper[4735]: I0317 01:12:52.072619 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:52 crc kubenswrapper[4735]: I0317 01:12:52.072679 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:52 crc kubenswrapper[4735]: I0317 01:12:52.072698 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:52 crc kubenswrapper[4735]: E0317 01:12:52.072843 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:52 crc kubenswrapper[4735]: E0317 01:12:52.073019 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:52 crc kubenswrapper[4735]: E0317 01:12:52.073476 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:52 crc kubenswrapper[4735]: E0317 01:12:52.073690 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:54 crc kubenswrapper[4735]: I0317 01:12:54.072361 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:54 crc kubenswrapper[4735]: I0317 01:12:54.072498 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:54 crc kubenswrapper[4735]: I0317 01:12:54.072390 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:54 crc kubenswrapper[4735]: E0317 01:12:54.072581 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:54 crc kubenswrapper[4735]: E0317 01:12:54.072761 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:54 crc kubenswrapper[4735]: I0317 01:12:54.072774 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:54 crc kubenswrapper[4735]: E0317 01:12:54.073096 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:54 crc kubenswrapper[4735]: E0317 01:12:54.072998 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:55 crc kubenswrapper[4735]: E0317 01:12:55.176219 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:12:56 crc kubenswrapper[4735]: I0317 01:12:56.072121 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:56 crc kubenswrapper[4735]: I0317 01:12:56.072174 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:56 crc kubenswrapper[4735]: I0317 01:12:56.072150 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:56 crc kubenswrapper[4735]: I0317 01:12:56.072268 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:56 crc kubenswrapper[4735]: E0317 01:12:56.072455 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:56 crc kubenswrapper[4735]: E0317 01:12:56.072588 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:56 crc kubenswrapper[4735]: E0317 01:12:56.072738 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:56 crc kubenswrapper[4735]: E0317 01:12:56.072940 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:58 crc kubenswrapper[4735]: I0317 01:12:58.072348 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:12:58 crc kubenswrapper[4735]: E0317 01:12:58.072526 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:12:58 crc kubenswrapper[4735]: I0317 01:12:58.072375 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:12:58 crc kubenswrapper[4735]: E0317 01:12:58.072635 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:12:58 crc kubenswrapper[4735]: I0317 01:12:58.072372 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:12:58 crc kubenswrapper[4735]: E0317 01:12:58.072727 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:12:58 crc kubenswrapper[4735]: I0317 01:12:58.072348 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:12:58 crc kubenswrapper[4735]: E0317 01:12:58.072823 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:12:59 crc kubenswrapper[4735]: I0317 01:12:59.073521 4735 scope.go:117] "RemoveContainer" containerID="e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1" Mar 17 01:12:59 crc kubenswrapper[4735]: I0317 01:12:59.686565 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/1.log" Mar 17 01:12:59 crc kubenswrapper[4735]: I0317 01:12:59.686620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerStarted","Data":"98b9f889d1e171e0f5a13534a10cf074ad341068dd1ccb49e17cd309da51ecaf"} Mar 17 01:13:00 crc kubenswrapper[4735]: I0317 01:13:00.072511 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:00 crc kubenswrapper[4735]: I0317 01:13:00.072530 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:00 crc kubenswrapper[4735]: E0317 01:13:00.073177 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:13:00 crc kubenswrapper[4735]: I0317 01:13:00.072830 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:00 crc kubenswrapper[4735]: E0317 01:13:00.073200 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:13:00 crc kubenswrapper[4735]: I0317 01:13:00.072692 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:00 crc kubenswrapper[4735]: E0317 01:13:00.073598 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:13:00 crc kubenswrapper[4735]: E0317 01:13:00.073849 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:13:00 crc kubenswrapper[4735]: E0317 01:13:00.178175 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 01:13:02 crc kubenswrapper[4735]: I0317 01:13:02.072812 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:02 crc kubenswrapper[4735]: I0317 01:13:02.072903 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:02 crc kubenswrapper[4735]: E0317 01:13:02.072991 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:13:02 crc kubenswrapper[4735]: I0317 01:13:02.072846 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:02 crc kubenswrapper[4735]: E0317 01:13:02.073126 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:13:02 crc kubenswrapper[4735]: I0317 01:13:02.073171 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:02 crc kubenswrapper[4735]: E0317 01:13:02.073219 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:13:02 crc kubenswrapper[4735]: E0317 01:13:02.073256 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:13:04 crc kubenswrapper[4735]: I0317 01:13:04.072968 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:04 crc kubenswrapper[4735]: I0317 01:13:04.073020 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:04 crc kubenswrapper[4735]: I0317 01:13:04.073059 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:04 crc kubenswrapper[4735]: I0317 01:13:04.073163 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:04 crc kubenswrapper[4735]: E0317 01:13:04.073156 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 01:13:04 crc kubenswrapper[4735]: E0317 01:13:04.073380 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 01:13:04 crc kubenswrapper[4735]: E0317 01:13:04.073537 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkwf5" podUID="3a72fe2c-32fb-4360-882b-44debb825c9e" Mar 17 01:13:04 crc kubenswrapper[4735]: E0317 01:13:04.073645 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.072601 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.072655 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.072700 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.072716 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.132589 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.132694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.133203 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.133466 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.133520 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 17 01:13:06 crc kubenswrapper[4735]: I0317 01:13:06.133964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 17 01:13:08 crc kubenswrapper[4735]: I0317 01:13:08.998541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.052410 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.052953 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.058331 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.059015 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.059237 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.066098 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.066757 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.067108 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xqdp7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.067637 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.084322 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-j2xfj"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.085101 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.085187 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.085293 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.085431 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.086049 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.086088 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.086368 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.086427 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jsnf"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.086730 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-p697v"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.086753 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.087143 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.107884 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.110111 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.114179 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.114370 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.114705 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2jwr2"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.114919 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nh28b"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.115169 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.127594 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.114384 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.122732 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.122795 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.128495 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.129617 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.130361 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lsrzv"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.130440 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.130750 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.131055 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nr62z"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.131431 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.131697 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.131914 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.137493 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.137929 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.138201 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.138456 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.138597 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.138844 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139026 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139154 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139244 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139332 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139419 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139608 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.139969 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.141171 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.141338 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.141461 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.141578 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.141697 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: 
I0317 01:13:09.142550 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.142748 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zgxs6"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.143390 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.145010 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.145169 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.145408 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.145536 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.145665 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.146084 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.146162 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.146283 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 17 01:13:09 
crc kubenswrapper[4735]: I0317 01:13:09.146535 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.147299 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.148577 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.150809 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.154939 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.156566 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.156610 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.156894 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.167928 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qb49c"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.168790 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.170266 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8rdqg"] Mar 17 01:13:09 
crc kubenswrapper[4735]: I0317 01:13:09.170869 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d9dqr"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.171296 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.171734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.171760 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.172570 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qcs85"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.173163 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.173577 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.173965 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.174273 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.174555 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.174778 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.175124 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.175398 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.186022 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.189047 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.194245 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.194485 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.195114 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.195280 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.195404 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.195537 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.195744 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.195887 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.196102 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.196275 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.196389 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.196521 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.196894 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.197157 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.200594 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 17 01:13:09 crc 
kubenswrapper[4735]: I0317 01:13:09.200803 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.201219 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.201370 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.201466 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.201501 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.201568 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.201615 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.202036 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.211096 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.211238 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.211320 4735 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.211575 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.211771 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.211998 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.212055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.212203 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.213433 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fce79315-fdfb-4384-b901-053a3482b27b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.217119 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219308 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219446 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 17 01:13:09 crc 
kubenswrapper[4735]: I0317 01:13:09.219612 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219709 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219800 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219826 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f326c014-e2bb-4d5a-a0ac-580a61a041f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219801 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1d16522-3b99-4ee4-b1ae-901b135c661d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/46967c4e-e4ac-403f-93b0-c2315a9a067f-machine-approver-tls\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219913 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-client-ca\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-config\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219943 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-config\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219979 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-config\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219988 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.219993 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-service-ca-bundle\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220028 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fce79315-fdfb-4384-b901-053a3482b27b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgk2m\" (UniqueName: \"kubernetes.io/projected/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-kube-api-access-sgk2m\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46967c4e-e4ac-403f-93b0-c2315a9a067f-config\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220076 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220085 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fce79315-fdfb-4384-b901-053a3482b27b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.220102 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r757\" (UniqueName: \"kubernetes.io/projected/d1d16522-3b99-4ee4-b1ae-901b135c661d-kube-api-access-8r757\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220119 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7dp\" (UniqueName: \"kubernetes.io/projected/9cb55cbc-4698-4a34-aa2d-bead443c0784-kube-api-access-xm7dp\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d16522-3b99-4ee4-b1ae-901b135c661d-config\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46967c4e-e4ac-403f-93b0-c2315a9a067f-auth-proxy-config\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2sg\" (UniqueName: 
\"kubernetes.io/projected/772d764a-7cf4-443a-93b6-a84726c59355-kube-api-access-2x2sg\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cx2c\" (UniqueName: \"kubernetes.io/projected/f326c014-e2bb-4d5a-a0ac-580a61a041f0-kube-api-access-9cx2c\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-serving-cert\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772d764a-7cf4-443a-93b6-a84726c59355-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-service-ca\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " 
pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220255 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-config\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-client-ca\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1d16522-3b99-4ee4-b1ae-901b135c661d-images\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb55cbc-4698-4a34-aa2d-bead443c0784-serving-cert\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.220342 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772d764a-7cf4-443a-93b6-a84726c59355-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220358 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-trusted-ca\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4f68\" (UniqueName: \"kubernetes.io/projected/46967c4e-e4ac-403f-93b0-c2315a9a067f-kube-api-access-n4f68\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-serving-cert\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220409 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afa8fe8-3078-4fcb-a84a-976b38223c48-serving-cert\") pod 
\"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-trusted-ca-bundle\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6sgw\" (UniqueName: \"kubernetes.io/projected/5afa8fe8-3078-4fcb-a84a-976b38223c48-kube-api-access-c6sgw\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220464 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-oauth-config\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220480 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220497 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56zvz\" (UniqueName: \"kubernetes.io/projected/fce79315-fdfb-4384-b901-053a3482b27b-kube-api-access-56zvz\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220515 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-oauth-serving-cert\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-config\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qvqn\" (UniqueName: \"kubernetes.io/projected/68d08823-e5f7-48eb-898e-3e59c772c8e9-kube-api-access-8qvqn\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.220888 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.221208 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"trusted-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.222009 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.222137 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.222386 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.222459 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xqdp7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.222849 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-j2xfj"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.223651 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.224734 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.230714 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.231751 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.233145 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.233266 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.233524 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.234554 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.235025 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.235627 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-s2dvf"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.237092 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.237713 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.237735 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.238069 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.238288 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.238487 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.238718 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.239515 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.240797 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.244922 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.245411 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.250842 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.254901 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sgbsk"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.257017 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cwkhd"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.258588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.260362 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.260123 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.274260 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.275467 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qdv79"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.276575 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.280198 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.281071 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.284027 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.284689 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.295013 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rpxx"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.296534 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.296782 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.301761 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.306051 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.306519 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.306715 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.308001 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.316923 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.317191 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.317530 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.318054 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561832-p6vx6"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.318664 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.319281 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jsnf"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.320473 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772d764a-7cf4-443a-93b6-a84726c59355-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-config\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321724 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321765 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-trusted-ca\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321825 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-service-ca\") pod \"etcd-operator-b45778765-qcs85\" (UID: 
\"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4f68\" (UniqueName: \"kubernetes.io/projected/46967c4e-e4ac-403f-93b0-c2315a9a067f-kube-api-access-n4f68\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321921 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-serving-cert\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-serving-cert\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e1db38b-1243-4309-8a2e-961697fc030a-metrics-tls\") pod \"dns-operator-744455d44c-8rdqg\" (UID: \"8e1db38b-1243-4309-8a2e-961697fc030a\") " pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.321985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5afa8fe8-3078-4fcb-a84a-976b38223c48-serving-cert\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83a10c7e-ff3c-41b6-adb7-14243953ed5a-audit-dir\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44b9l\" (UniqueName: \"kubernetes.io/projected/8839a674-43ea-4a76-a2c5-b56261bac28d-kube-api-access-44b9l\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322035 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-trusted-ca-bundle\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6sgw\" (UniqueName: \"kubernetes.io/projected/5afa8fe8-3078-4fcb-a84a-976b38223c48-kube-api-access-c6sgw\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.322071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322091 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56zvz\" (UniqueName: \"kubernetes.io/projected/fce79315-fdfb-4384-b901-053a3482b27b-kube-api-access-56zvz\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322124 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-oauth-config\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322180 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz52z\" (UniqueName: \"kubernetes.io/projected/1868585d-68a4-4a7f-85c2-fd1c4eef0532-kube-api-access-gz52z\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322241 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/96e8ca98-69e8-409a-acc1-85ee705dfede-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-oauth-serving-cert\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.322329 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb41bde3-7791-4ac9-bda5-8c094ab6c904-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322345 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/405c316c-f04d-4e00-9009-35dbce4a94c6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-config\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qvqn\" (UniqueName: \"kubernetes.io/projected/68d08823-e5f7-48eb-898e-3e59c772c8e9-kube-api-access-8qvqn\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c316c-f04d-4e00-9009-35dbce4a94c6-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322417 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-encryption-config\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fce79315-fdfb-4384-b901-053a3482b27b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322452 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772d764a-7cf4-443a-93b6-a84726c59355-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322463 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f326c014-e2bb-4d5a-a0ac-580a61a041f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322484 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb41bde3-7791-4ac9-bda5-8c094ab6c904-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc4w2\" (UniqueName: \"kubernetes.io/projected/60ccc3db-c677-47ff-8113-5eb103809a4a-kube-api-access-gc4w2\") pod \"downloads-7954f5f757-lsrzv\" (UID: \"60ccc3db-c677-47ff-8113-5eb103809a4a\") " pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322640 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1d16522-3b99-4ee4-b1ae-901b135c661d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322658 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322731 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/46967c4e-e4ac-403f-93b0-c2315a9a067f-machine-approver-tls\") pod 
\"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-client-ca\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322979 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-config\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.322995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-config\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1868585d-68a4-4a7f-85c2-fd1c4eef0532-trusted-ca\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323084 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-policies\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323103 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-client\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-config\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323151 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-service-ca-bundle\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1868585d-68a4-4a7f-85c2-fd1c4eef0532-bound-sa-token\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8839a674-43ea-4a76-a2c5-b56261bac28d-serving-cert\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323200 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-audit-policies\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/fce79315-fdfb-4384-b901-053a3482b27b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323264 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgk2m\" (UniqueName: \"kubernetes.io/projected/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-kube-api-access-sgk2m\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323334 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323432 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323479 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-etcd-client\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46967c4e-e4ac-403f-93b0-c2315a9a067f-config\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323574 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323596 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xm7dp\" (UniqueName: \"kubernetes.io/projected/9cb55cbc-4698-4a34-aa2d-bead443c0784-kube-api-access-xm7dp\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fce79315-fdfb-4384-b901-053a3482b27b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r757\" (UniqueName: \"kubernetes.io/projected/d1d16522-3b99-4ee4-b1ae-901b135c661d-kube-api-access-8r757\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-dir\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323658 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27tt\" (UniqueName: \"kubernetes.io/projected/96e8ca98-69e8-409a-acc1-85ee705dfede-kube-api-access-t27tt\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-ca\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323704 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d16522-3b99-4ee4-b1ae-901b135c661d-config\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323750 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1868585d-68a4-4a7f-85c2-fd1c4eef0532-metrics-tls\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323768 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9mn\" (UniqueName: \"kubernetes.io/projected/6bdc2a09-1108-46eb-bfad-41ddee55a93c-kube-api-access-2z9mn\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323782 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99p8d\" (UniqueName: 
\"kubernetes.io/projected/8e1db38b-1243-4309-8a2e-961697fc030a-kube-api-access-99p8d\") pod \"dns-operator-744455d44c-8rdqg\" (UID: \"8e1db38b-1243-4309-8a2e-961697fc030a\") " pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323804 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46967c4e-e4ac-403f-93b0-c2315a9a067f-auth-proxy-config\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mbrw\" (UniqueName: \"kubernetes.io/projected/eb41bde3-7791-4ac9-bda5-8c094ab6c904-kube-api-access-9mbrw\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhst\" (UniqueName: \"kubernetes.io/projected/83a10c7e-ff3c-41b6-adb7-14243953ed5a-kube-api-access-6mhst\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x2sg\" (UniqueName: \"kubernetes.io/projected/772d764a-7cf4-443a-93b6-a84726c59355-kube-api-access-2x2sg\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323961 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96e8ca98-69e8-409a-acc1-85ee705dfede-serving-cert\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.323986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cx2c\" (UniqueName: \"kubernetes.io/projected/f326c014-e2bb-4d5a-a0ac-580a61a041f0-kube-api-access-9cx2c\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324022 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-serving-cert\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324063 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c316c-f04d-4e00-9009-35dbce4a94c6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772d764a-7cf4-443a-93b6-a84726c59355-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324097 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-service-ca\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-client-ca\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324134 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-config\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.324148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb55cbc-4698-4a34-aa2d-bead443c0784-serving-cert\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.324162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1d16522-3b99-4ee4-b1ae-901b135c661d-images\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.326237 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-oauth-serving-cert\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.327597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-trusted-ca-bundle\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.328085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-trusted-ca\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.328652 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1d16522-3b99-4ee4-b1ae-901b135c661d-images\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.329294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46967c4e-e4ac-403f-93b0-c2315a9a067f-config\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.330873 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d16522-3b99-4ee4-b1ae-901b135c661d-config\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.331306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46967c4e-e4ac-403f-93b0-c2315a9a067f-auth-proxy-config\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.331448 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-config\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: 
I0317 01:13:09.333212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fce79315-fdfb-4384-b901-053a3482b27b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.333637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-config\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.334038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-service-ca-bundle\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.335394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-oauth-config\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.335447 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jjfm7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.336706 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7"] Mar 17 
01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.336796 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.337308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-service-ca\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.337826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-client-ca\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.338598 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.338730 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-config\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.339444 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f326c014-e2bb-4d5a-a0ac-580a61a041f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc 
kubenswrapper[4735]: I0317 01:13:09.340844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-client-ca\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.341549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-serving-cert\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.341638 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.342587 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-config\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.342670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fce79315-fdfb-4384-b901-053a3482b27b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.343733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afa8fe8-3078-4fcb-a84a-976b38223c48-config\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.344412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb55cbc-4698-4a34-aa2d-bead443c0784-serving-cert\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.345537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1d16522-3b99-4ee4-b1ae-901b135c661d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.347704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/46967c4e-e4ac-403f-93b0-c2315a9a067f-machine-approver-tls\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.350026 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lsrzv"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.351845 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.352463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-serving-cert\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.353068 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2jwr2"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.354233 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.356405 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8rdqg"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.357489 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772d764a-7cf4-443a-93b6-a84726c59355-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.358417 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afa8fe8-3078-4fcb-a84a-976b38223c48-serving-cert\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 
01:13:09.359640 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.361830 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d9dqr"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.363833 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nh28b"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.364641 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.365804 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nr62z"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.368978 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.375138 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.376709 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.378143 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-r87ml"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.378989 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.380373 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwwj9"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.383198 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qdv79"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.383309 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.383341 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sgbsk"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.384693 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.385797 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.386994 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qb49c"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.388170 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.389207 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qcs85"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.390355 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw"] Mar 17 01:13:09 crc 
kubenswrapper[4735]: I0317 01:13:09.391382 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.392519 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zgxs6"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.393501 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-p6vx6"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.394573 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.394903 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.395710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.396778 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.397846 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.398941 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.400740 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cwkhd"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.401845 4735 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rpxx"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.403024 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tn9pc"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.413141 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.414148 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.415325 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.416355 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jjfm7"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.418838 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tn9pc"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.420221 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwwj9"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.421205 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78"] Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425017 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz52z\" (UniqueName: \"kubernetes.io/projected/1868585d-68a4-4a7f-85c2-fd1c4eef0532-kube-api-access-gz52z\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425043 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/96e8ca98-69e8-409a-acc1-85ee705dfede-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb41bde3-7791-4ac9-bda5-8c094ab6c904-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425093 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c316c-f04d-4e00-9009-35dbce4a94c6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/405c316c-f04d-4e00-9009-35dbce4a94c6-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-encryption-config\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425178 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb41bde3-7791-4ac9-bda5-8c094ab6c904-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425197 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc4w2\" (UniqueName: \"kubernetes.io/projected/60ccc3db-c677-47ff-8113-5eb103809a4a-kube-api-access-gc4w2\") pod \"downloads-7954f5f757-lsrzv\" (UID: \"60ccc3db-c677-47ff-8113-5eb103809a4a\") " pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425247 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1868585d-68a4-4a7f-85c2-fd1c4eef0532-trusted-ca\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-policies\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425312 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-client\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1868585d-68a4-4a7f-85c2-fd1c4eef0532-bound-sa-token\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425363 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8839a674-43ea-4a76-a2c5-b56261bac28d-serving-cert\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425392 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425428 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-audit-policies\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425454 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-etcd-client\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425524 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-dir\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425544 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27tt\" (UniqueName: \"kubernetes.io/projected/96e8ca98-69e8-409a-acc1-85ee705dfede-kube-api-access-t27tt\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425562 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-ca\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9mn\" (UniqueName: \"kubernetes.io/projected/6bdc2a09-1108-46eb-bfad-41ddee55a93c-kube-api-access-2z9mn\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425613 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99p8d\" (UniqueName: \"kubernetes.io/projected/8e1db38b-1243-4309-8a2e-961697fc030a-kube-api-access-99p8d\") pod \"dns-operator-744455d44c-8rdqg\" (UID: \"8e1db38b-1243-4309-8a2e-961697fc030a\") " pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1868585d-68a4-4a7f-85c2-fd1c4eef0532-metrics-tls\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425654 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhst\" (UniqueName: \"kubernetes.io/projected/83a10c7e-ff3c-41b6-adb7-14243953ed5a-kube-api-access-6mhst\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 
01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mbrw\" (UniqueName: \"kubernetes.io/projected/eb41bde3-7791-4ac9-bda5-8c094ab6c904-kube-api-access-9mbrw\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96e8ca98-69e8-409a-acc1-85ee705dfede-serving-cert\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425777 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425798 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c316c-f04d-4e00-9009-35dbce4a94c6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425864 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-config\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425884 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425917 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-service-ca\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425954 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-serving-cert\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425970 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e1db38b-1243-4309-8a2e-961697fc030a-metrics-tls\") pod \"dns-operator-744455d44c-8rdqg\" (UID: \"8e1db38b-1243-4309-8a2e-961697fc030a\") " pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425996 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83a10c7e-ff3c-41b6-adb7-14243953ed5a-audit-dir\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.426016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44b9l\" (UniqueName: \"kubernetes.io/projected/8839a674-43ea-4a76-a2c5-b56261bac28d-kube-api-access-44b9l\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.426051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 
01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.426069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.425565 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/96e8ca98-69e8-409a-acc1-85ee705dfede-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.426473 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb41bde3-7791-4ac9-bda5-8c094ab6c904-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.426607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-dir\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.426651 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: 
\"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.427038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.427434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-ca\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.429032 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-service-ca\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.429305 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-policies\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.429333 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.429477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83a10c7e-ff3c-41b6-adb7-14243953ed5a-audit-dir\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.429733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.430596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8839a674-43ea-4a76-a2c5-b56261bac28d-config\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.431426 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-etcd-client\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.431437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eb41bde3-7791-4ac9-bda5-8c094ab6c904-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.431794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.432987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96e8ca98-69e8-409a-acc1-85ee705dfede-serving-cert\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.433384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8839a674-43ea-4a76-a2c5-b56261bac28d-serving-cert\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.433608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc 
kubenswrapper[4735]: I0317 01:13:09.434302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.434985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e1db38b-1243-4309-8a2e-961697fc030a-metrics-tls\") pod \"dns-operator-744455d44c-8rdqg\" (UID: \"8e1db38b-1243-4309-8a2e-961697fc030a\") " pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.435310 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.436553 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.438340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8839a674-43ea-4a76-a2c5-b56261bac28d-etcd-client\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.438436 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.438695 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.439036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.439260 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-encryption-config\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.440515 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:09 crc kubenswrapper[4735]: 
I0317 01:13:09.455261 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.464199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83a10c7e-ff3c-41b6-adb7-14243953ed5a-serving-cert\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.474467 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.494915 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.523642 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.530458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1868585d-68a4-4a7f-85c2-fd1c4eef0532-trusted-ca\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.557923 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.574911 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.580658 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c316c-f04d-4e00-9009-35dbce4a94c6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.594978 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.597166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.616769 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.655006 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.679095 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.689730 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/83a10c7e-ff3c-41b6-adb7-14243953ed5a-audit-policies\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.696090 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.701688 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1868585d-68a4-4a7f-85c2-fd1c4eef0532-metrics-tls\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.714957 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.715986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c316c-f04d-4e00-9009-35dbce4a94c6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.735984 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.755421 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.775814 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.795934 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.815052 4735 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.835238 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.856316 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.876878 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.895909 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.916382 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.935361 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.955891 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.975838 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 17 01:13:09 crc kubenswrapper[4735]: I0317 01:13:09.995067 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.016032 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.036281 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.057510 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.075621 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.095060 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.115634 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.135848 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.156363 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.175700 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.195278 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.215654 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.236084 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.253647 4735 request.go:700] Waited for 1.00238245s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0 Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.255690 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.275227 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.295439 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.316197 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.336044 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.355303 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.376235 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 17 01:13:10 crc 
kubenswrapper[4735]: I0317 01:13:10.396453 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.416154 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.435510 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.455799 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.475065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.495667 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.516550 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.535275 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.555885 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.576263 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.595888 4735 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.615267 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.635778 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.656556 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.676342 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.694998 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.715391 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.751305 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.755661 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.775745 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.795980 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 
01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.816055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.835545 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.855588 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.875827 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.896029 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.944657 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6sgw\" (UniqueName: \"kubernetes.io/projected/5afa8fe8-3078-4fcb-a84a-976b38223c48-kube-api-access-c6sgw\") pod \"authentication-operator-69f744f599-2jwr2\" (UID: \"5afa8fe8-3078-4fcb-a84a-976b38223c48\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.960955 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7dp\" (UniqueName: \"kubernetes.io/projected/9cb55cbc-4698-4a34-aa2d-bead443c0784-kube-api-access-xm7dp\") pod \"controller-manager-879f6c89f-5jsnf\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:10 crc kubenswrapper[4735]: I0317 01:13:10.977351 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r757\" (UniqueName: 
\"kubernetes.io/projected/d1d16522-3b99-4ee4-b1ae-901b135c661d-kube-api-access-8r757\") pod \"machine-api-operator-5694c8668f-xqdp7\" (UID: \"d1d16522-3b99-4ee4-b1ae-901b135c661d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.000360 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56zvz\" (UniqueName: \"kubernetes.io/projected/fce79315-fdfb-4384-b901-053a3482b27b-kube-api-access-56zvz\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.014409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2sg\" (UniqueName: \"kubernetes.io/projected/772d764a-7cf4-443a-93b6-a84726c59355-kube-api-access-2x2sg\") pod \"openshift-apiserver-operator-796bbdcf4f-rh7dw\" (UID: \"772d764a-7cf4-443a-93b6-a84726c59355\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.022108 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.037892 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cx2c\" (UniqueName: \"kubernetes.io/projected/f326c014-e2bb-4d5a-a0ac-580a61a041f0-kube-api-access-9cx2c\") pod \"route-controller-manager-6576b87f9c-hwnhr\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.051109 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.057153 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.058115 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qvqn\" (UniqueName: \"kubernetes.io/projected/68d08823-e5f7-48eb-898e-3e59c772c8e9-kube-api-access-8qvqn\") pod \"console-f9d7485db-nh28b\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.075520 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.096372 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.115540 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.188573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgk2m\" (UniqueName: \"kubernetes.io/projected/a0fdf8be-34e0-4ed0-b462-13b38cecfd73-kube-api-access-sgk2m\") pod \"console-operator-58897d9998-j2xfj\" (UID: \"a0fdf8be-34e0-4ed0-b462-13b38cecfd73\") " pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.195258 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.222111 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.222412 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.227352 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fce79315-fdfb-4384-b901-053a3482b27b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-scxl7\" (UID: \"fce79315-fdfb-4384-b901-053a3482b27b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.227491 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.228703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4f68\" (UniqueName: \"kubernetes.io/projected/46967c4e-e4ac-403f-93b0-c2315a9a067f-kube-api-access-n4f68\") pod \"machine-approver-56656f9798-p697v\" (UID: \"46967c4e-e4ac-403f-93b0-c2315a9a067f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.236367 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.243283 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.251908 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.253888 4735 request.go:700] Waited for 1.870135475s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.256081 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.275424 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.298757 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.303729 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.316831 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.325069 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jsnf"] Mar 17 01:13:11 crc kubenswrapper[4735]: W0317 01:13:11.343461 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46967c4e_e4ac_403f_93b0_c2315a9a067f.slice/crio-0681868ec9c69e72a71290defeb881b20f314baf20680dbf9cc67b1515054561 WatchSource:0}: Error finding container 0681868ec9c69e72a71290defeb881b20f314baf20680dbf9cc67b1515054561: Status 404 returned error can't find the container with id 0681868ec9c69e72a71290defeb881b20f314baf20680dbf9cc67b1515054561 Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.343653 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.345153 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.345384 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:11 crc kubenswrapper[4735]: W0317 01:13:11.354541 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb55cbc_4698_4a34_aa2d_bead443c0784.slice/crio-22e250f42397c59b47ad967340d7a30ee2e8d444b3ef1c9148e28a2f710ebc27 WatchSource:0}: Error finding container 22e250f42397c59b47ad967340d7a30ee2e8d444b3ef1c9148e28a2f710ebc27: Status 404 returned error can't find the container with id 22e250f42397c59b47ad967340d7a30ee2e8d444b3ef1c9148e28a2f710ebc27 Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.355653 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.399164 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz52z\" (UniqueName: \"kubernetes.io/projected/1868585d-68a4-4a7f-85c2-fd1c4eef0532-kube-api-access-gz52z\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.407832 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2jwr2"] Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.422059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc4w2\" (UniqueName: \"kubernetes.io/projected/60ccc3db-c677-47ff-8113-5eb103809a4a-kube-api-access-gc4w2\") pod \"downloads-7954f5f757-lsrzv\" (UID: \"60ccc3db-c677-47ff-8113-5eb103809a4a\") " pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.432968 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/405c316c-f04d-4e00-9009-35dbce4a94c6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c56vf\" (UID: \"405c316c-f04d-4e00-9009-35dbce4a94c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.461785 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27tt\" (UniqueName: \"kubernetes.io/projected/96e8ca98-69e8-409a-acc1-85ee705dfede-kube-api-access-t27tt\") pod \"openshift-config-operator-7777fb866f-nr62z\" (UID: \"96e8ca98-69e8-409a-acc1-85ee705dfede\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.474483 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9mn\" (UniqueName: \"kubernetes.io/projected/6bdc2a09-1108-46eb-bfad-41ddee55a93c-kube-api-access-2z9mn\") pod \"oauth-openshift-558db77b4-zgxs6\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.495826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99p8d\" (UniqueName: \"kubernetes.io/projected/8e1db38b-1243-4309-8a2e-961697fc030a-kube-api-access-99p8d\") pod \"dns-operator-744455d44c-8rdqg\" (UID: \"8e1db38b-1243-4309-8a2e-961697fc030a\") " pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.512802 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhst\" (UniqueName: \"kubernetes.io/projected/83a10c7e-ff3c-41b6-adb7-14243953ed5a-kube-api-access-6mhst\") pod \"apiserver-7bbb656c7d-5grcp\" (UID: \"83a10c7e-ff3c-41b6-adb7-14243953ed5a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:11 crc kubenswrapper[4735]: 
I0317 01:13:11.529718 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mbrw\" (UniqueName: \"kubernetes.io/projected/eb41bde3-7791-4ac9-bda5-8c094ab6c904-kube-api-access-9mbrw\") pod \"openshift-controller-manager-operator-756b6f6bc6-j76kq\" (UID: \"eb41bde3-7791-4ac9-bda5-8c094ab6c904\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.548178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.548915 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1868585d-68a4-4a7f-85c2-fd1c4eef0532-bound-sa-token\") pod \"ingress-operator-5b745b69d9-tdx94\" (UID: \"1868585d-68a4-4a7f-85c2-fd1c4eef0532\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.572010 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44b9l\" (UniqueName: \"kubernetes.io/projected/8839a674-43ea-4a76-a2c5-b56261bac28d-kube-api-access-44b9l\") pod \"etcd-operator-b45778765-qcs85\" (UID: \"8839a674-43ea-4a76-a2c5-b56261bac28d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.612884 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.715991 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-j2xfj"] Mar 17 01:13:11 crc kubenswrapper[4735]: W0317 01:13:11.724566 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fdf8be_34e0_4ed0_b462_13b38cecfd73.slice/crio-77c984818f515796d56ac14216e378c11161133c03cfc13346cc0ea3f029667e WatchSource:0}: Error finding container 77c984818f515796d56ac14216e378c11161133c03cfc13346cc0ea3f029667e: Status 404 returned error can't find the container with id 77c984818f515796d56ac14216e378c11161133c03cfc13346cc0ea3f029667e Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.728108 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr"] Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.728155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7"] Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.732551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" event={"ID":"9cb55cbc-4698-4a34-aa2d-bead443c0784","Type":"ContainerStarted","Data":"4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8"} Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.732581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" event={"ID":"9cb55cbc-4698-4a34-aa2d-bead443c0784","Type":"ContainerStarted","Data":"22e250f42397c59b47ad967340d7a30ee2e8d444b3ef1c9148e28a2f710ebc27"} Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.732922 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:11 crc kubenswrapper[4735]: W0317 01:13:11.733368 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf326c014_e2bb_4d5a_a0ac_580a61a041f0.slice/crio-a13d6b3021951b73ef13a57bf03fd9ce909c18c3d5732c236fdd715420564012 WatchSource:0}: Error finding container a13d6b3021951b73ef13a57bf03fd9ce909c18c3d5732c236fdd715420564012: Status 404 returned error can't find the container with id a13d6b3021951b73ef13a57bf03fd9ce909c18c3d5732c236fdd715420564012 Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.735447 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5jsnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.735491 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.735567 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" event={"ID":"5afa8fe8-3078-4fcb-a84a-976b38223c48","Type":"ContainerStarted","Data":"b53d7d053a3f31724e787218f71240107fed03334ecf9655d5ea17cbf1f368df"} Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.735607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" event={"ID":"5afa8fe8-3078-4fcb-a84a-976b38223c48","Type":"ContainerStarted","Data":"36d2dddb2728f1ef186fbfdd8debabe619b1af8f8fb64885f6f38877512bf6e9"} Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.739446 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" event={"ID":"46967c4e-e4ac-403f-93b0-c2315a9a067f","Type":"ContainerStarted","Data":"0681868ec9c69e72a71290defeb881b20f314baf20680dbf9cc67b1515054561"} Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.740699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" event={"ID":"a0fdf8be-34e0-4ed0-b462-13b38cecfd73","Type":"ContainerStarted","Data":"77c984818f515796d56ac14216e378c11161133c03cfc13346cc0ea3f029667e"} Mar 17 01:13:11 crc kubenswrapper[4735]: W0317 01:13:11.744540 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfce79315_fdfb_4384_b901_053a3482b27b.slice/crio-ae5f875d0a531cd9a29c25d350bf54c84ab24a89b7e0825b1dd6c5e3044792df WatchSource:0}: Error finding container ae5f875d0a531cd9a29c25d350bf54c84ab24a89b7e0825b1dd6c5e3044792df: Status 404 returned error can't find the container with id ae5f875d0a531cd9a29c25d350bf54c84ab24a89b7e0825b1dd6c5e3044792df Mar 17 01:13:11 crc kubenswrapper[4735]: I0317 01:13:11.845038 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xqdp7"] Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.595489 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.596525 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.597061 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.597153 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.597499 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.597651 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.598063 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.598283 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.598578 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.598651 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-tls\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: E0317 01:13:12.599771 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.099742639 +0000 UTC m=+218.731975657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.606943 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.607405 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.653414 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nh28b"] Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.654865 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw"] Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.734652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:12 crc kubenswrapper[4735]: E0317 01:13:12.734740 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.234718613 +0000 UTC m=+218.866951591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3958599a-489b-430d-bbb4-56d7e88945d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735042 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3958599a-489b-430d-bbb4-56d7e88945d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735085 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-image-import-ca\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735101 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a039ba69-6339-4e2d-a131-2018bfcc356d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sgbsk\" (UID: \"a039ba69-6339-4e2d-a131-2018bfcc356d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735131 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-mountpoint-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprwb\" (UniqueName: 
\"kubernetes.io/projected/c4bd5744-869c-4763-af43-3ffcce4d549f-kube-api-access-hprwb\") pod \"auto-csr-approver-29561832-p6vx6\" (UID: \"c4bd5744-869c-4763-af43-3ffcce4d549f\") " pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735189 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-etcd-serving-ca\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735241 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25sg9\" (UniqueName: \"kubernetes.io/projected/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-kube-api-access-25sg9\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-trusted-ca\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" 
Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-config\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-encryption-config\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735596 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/211ba440-450f-4fbe-aaf8-b540716338d1-node-pullsecrets\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735610 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-csi-data-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735649 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-plugins-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " 
pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3958599a-489b-430d-bbb4-56d7e88945d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735691 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-bound-sa-token\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-proxy-tls\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735733 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735751 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/748d17ea-e9d8-429c-8ed8-a94c3d6c23d4-kube-api-access-82cbf\") pod \"cluster-samples-operator-665b6dd947-2wn4z\" (UID: \"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735767 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pm8\" (UniqueName: \"kubernetes.io/projected/a039ba69-6339-4e2d-a131-2018bfcc356d-kube-api-access-v5pm8\") pod \"multus-admission-controller-857f4d67dd-sgbsk\" (UID: \"a039ba69-6339-4e2d-a131-2018bfcc356d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735782 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-registration-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735817 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-audit\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.735849 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/748d17ea-e9d8-429c-8ed8-a94c3d6c23d4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2wn4z\" (UID: 
\"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.737639 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs75q\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-kube-api-access-xs75q\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.737947 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.737981 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnrbc\" (UniqueName: \"kubernetes.io/projected/211ba440-450f-4fbe-aaf8-b540716338d1-kube-api-access-gnrbc\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738046 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-certificates\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738087 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-socket-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-tls\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738237 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85e67779-627e-4c7b-8105-8bb93f10ec15-ca-trust-extracted\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738253 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5h6r\" (UniqueName: \"kubernetes.io/projected/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-kube-api-access-b5h6r\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-serving-cert\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738342 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85e67779-627e-4c7b-8105-8bb93f10ec15-installation-pull-secrets\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2g9n\" (UniqueName: \"kubernetes.io/projected/d47d4aa8-edf6-48a1-ae0f-850cee2dc44f-kube-api-access-m2g9n\") pod \"ingress-canary-jjfm7\" (UID: \"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f\") " pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4cq\" (UniqueName: \"kubernetes.io/projected/a8197b9a-a197-4e40-bebb-e8308ec1c094-kube-api-access-jb4cq\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738469 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d47d4aa8-edf6-48a1-ae0f-850cee2dc44f-cert\") pod \"ingress-canary-jjfm7\" (UID: \"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f\") " pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-etcd-client\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.738533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/211ba440-450f-4fbe-aaf8-b540716338d1-audit-dir\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: E0317 01:13:12.740179 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.24016776 +0000 UTC m=+218.872400738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.755925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-tls\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.757974 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" event={"ID":"772d764a-7cf4-443a-93b6-a84726c59355","Type":"ContainerStarted","Data":"822d5400921a917acd98219e015349d3edfd814413dfeb2d7c8356e139caeeb8"} Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.759741 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" event={"ID":"fce79315-fdfb-4384-b901-053a3482b27b","Type":"ContainerStarted","Data":"ae5f875d0a531cd9a29c25d350bf54c84ab24a89b7e0825b1dd6c5e3044792df"} Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.761085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" event={"ID":"46967c4e-e4ac-403f-93b0-c2315a9a067f","Type":"ContainerStarted","Data":"bf8de8900972d1ec49b2eda43813e2f41c9246a84a5707ee952bfd0f886d5bfe"} Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.765382 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-f9d7485db-nh28b" event={"ID":"68d08823-e5f7-48eb-898e-3e59c772c8e9","Type":"ContainerStarted","Data":"9a00c32f856ffafffc0a65b10abe8d6f626d28422375f35c3f17337ef3cc4e80"} Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.809626 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" event={"ID":"d1d16522-3b99-4ee4-b1ae-901b135c661d","Type":"ContainerStarted","Data":"b125992021c6e139d500fe94c08e78d500f1700bd1a39378bfdd8b5e3cc08544"} Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.811239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" event={"ID":"f326c014-e2bb-4d5a-a0ac-580a61a041f0","Type":"ContainerStarted","Data":"a13d6b3021951b73ef13a57bf03fd9ce909c18c3d5732c236fdd715420564012"} Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.812253 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5jsnf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.812290 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.835044 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf"] Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839241 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4299c484-7f23-433c-a4b9-d3daf740fa7c-webhook-cert\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d47d4aa8-edf6-48a1-ae0f-850cee2dc44f-cert\") pod \"ingress-canary-jjfm7\" (UID: \"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f\") " pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839475 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-etcd-client\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/211ba440-450f-4fbe-aaf8-b540716338d1-audit-dir\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839509 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbd537a4-457f-47fc-9ed0-07044027b0d5-metrics-tls\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839536 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwj9\" (UniqueName: \"kubernetes.io/projected/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-kube-api-access-8fwj9\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3958599a-489b-430d-bbb4-56d7e88945d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839581 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/687efab0-3f01-453e-b202-46d47422c46d-profile-collector-cert\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839597 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3958599a-489b-430d-bbb4-56d7e88945d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/125b8018-6680-4e83-b5b3-a8b177406d9d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839642 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db479d6c-1a3c-4de1-86be-ba6806eac784-signing-cabundle\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-image-import-ca\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/a039ba69-6339-4e2d-a131-2018bfcc356d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sgbsk\" (UID: \"a039ba69-6339-4e2d-a131-2018bfcc356d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839692 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbb6r\" (UniqueName: \"kubernetes.io/projected/ecb29df1-b4b4-473a-8795-564a241ce156-kube-api-access-qbb6r\") pod \"migrator-59844c95c7-kcqhw\" (UID: \"ecb29df1-b4b4-473a-8795-564a241ce156\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839711 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpvbh\" (UniqueName: \"kubernetes.io/projected/76c60596-d8ac-452c-bf33-b35157ee0970-kube-api-access-mpvbh\") pod \"control-plane-machine-set-operator-78cbb6b69f-hsd5h\" (UID: \"76c60596-d8ac-452c-bf33-b35157ee0970\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839736 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-mountpoint-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839752 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hprwb\" (UniqueName: \"kubernetes.io/projected/c4bd5744-869c-4763-af43-3ffcce4d549f-kube-api-access-hprwb\") pod \"auto-csr-approver-29561832-p6vx6\" (UID: \"c4bd5744-869c-4763-af43-3ffcce4d549f\") " pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:13:12 
crc kubenswrapper[4735]: I0317 01:13:12.839786 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvj2t\" (UniqueName: \"kubernetes.io/projected/687efab0-3f01-453e-b202-46d47422c46d-kube-api-access-tvj2t\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839801 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/687efab0-3f01-453e-b202-46d47422c46d-srv-cert\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-etcd-serving-ca\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839851 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3fe26de-7115-40ae-a335-8feafa74fb68-secret-volume\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839899 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125b8018-6680-4e83-b5b3-a8b177406d9d-config\") pod 
\"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839926 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25sg9\" (UniqueName: \"kubernetes.io/projected/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-kube-api-access-25sg9\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839941 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpmn\" (UniqueName: \"kubernetes.io/projected/a3fe26de-7115-40ae-a335-8feafa74fb68-kube-api-access-whpmn\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.839967 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfx8f\" (UniqueName: \"kubernetes.io/projected/dbd537a4-457f-47fc-9ed0-07044027b0d5-kube-api-access-mfx8f\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-trusted-ca\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840044 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-config\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840071 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-encryption-config\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840096 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/76c60596-d8ac-452c-bf33-b35157ee0970-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hsd5h\" (UID: \"76c60596-d8ac-452c-bf33-b35157ee0970\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840112 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/211ba440-450f-4fbe-aaf8-b540716338d1-node-pullsecrets\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840127 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-csi-data-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 
crc kubenswrapper[4735]: I0317 01:13:12.840155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-plugins-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840170 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-metrics-certs\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840191 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3958599a-489b-430d-bbb4-56d7e88945d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840210 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae1e8782-bee4-4a76-837f-69e3735abc86-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-bound-sa-token\") pod 
\"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-proxy-tls\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/748d17ea-e9d8-429c-8ed8-a94c3d6c23d4-kube-api-access-82cbf\") pod \"cluster-samples-operator-665b6dd947-2wn4z\" (UID: \"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pm8\" (UniqueName: \"kubernetes.io/projected/a039ba69-6339-4e2d-a131-2018bfcc356d-kube-api-access-v5pm8\") pod \"multus-admission-controller-857f4d67dd-sgbsk\" (UID: \"a039ba69-6339-4e2d-a131-2018bfcc356d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840329 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-registration-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840345 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj9zg\" (UniqueName: \"kubernetes.io/projected/a33da5ce-b959-43e3-a18c-99c8d5e7af40-kube-api-access-fj9zg\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-audit\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a33da5ce-b959-43e3-a18c-99c8d5e7af40-serving-cert\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840416 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zfdr\" (UniqueName: \"kubernetes.io/projected/edc97b39-8677-476a-b336-1645f172c219-kube-api-access-2zfdr\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjggq\" (UniqueName: \"kubernetes.io/projected/ae1e8782-bee4-4a76-837f-69e3735abc86-kube-api-access-wjggq\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs75q\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-kube-api-access-xs75q\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840469 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/748d17ea-e9d8-429c-8ed8-a94c3d6c23d4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2wn4z\" (UID: \"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840485 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-service-ca-bundle\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840510 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnrbc\" (UniqueName: \"kubernetes.io/projected/211ba440-450f-4fbe-aaf8-b540716338d1-kube-api-access-gnrbc\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840551 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/edc97b39-8677-476a-b336-1645f172c219-srv-cert\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnm6\" (UniqueName: \"kubernetes.io/projected/11908f1a-0399-4964-86cc-8a2e51d35821-kube-api-access-zgnm6\") pod \"package-server-manager-789f6589d5-6xl7d\" (UID: \"11908f1a-0399-4964-86cc-8a2e51d35821\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-certificates\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840655 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-socket-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840672 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae1e8782-bee4-4a76-837f-69e3735abc86-proxy-tls\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3fe26de-7115-40ae-a335-8feafa74fb68-config-volume\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840738 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd537a4-457f-47fc-9ed0-07044027b0d5-config-volume\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/11908f1a-0399-4964-86cc-8a2e51d35821-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xl7d\" (UID: \"11908f1a-0399-4964-86cc-8a2e51d35821\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840784 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-default-certificate\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzw5l\" (UniqueName: \"kubernetes.io/projected/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-kube-api-access-wzw5l\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db479d6c-1a3c-4de1-86be-ba6806eac784-signing-key\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840833 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4299c484-7f23-433c-a4b9-d3daf740fa7c-apiservice-cert\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840848 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-stats-auth\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840878 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqf6n\" (UniqueName: \"kubernetes.io/projected/db479d6c-1a3c-4de1-86be-ba6806eac784-kube-api-access-vqf6n\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840907 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/85e67779-627e-4c7b-8105-8bb93f10ec15-ca-trust-extracted\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5h6r\" (UniqueName: \"kubernetes.io/projected/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-kube-api-access-b5h6r\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840941 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01be826f-c4c9-48db-9dd0-5f614a3781b8-node-bootstrap-token\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840956 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/125b8018-6680-4e83-b5b3-a8b177406d9d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840975 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae1e8782-bee4-4a76-837f-69e3735abc86-images\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.840997 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-serving-cert\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841019 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a33da5ce-b959-43e3-a18c-99c8d5e7af40-config\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841052 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4299c484-7f23-433c-a4b9-d3daf740fa7c-tmpfs\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841073 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjml6\" (UniqueName: \"kubernetes.io/projected/4299c484-7f23-433c-a4b9-d3daf740fa7c-kube-api-access-bjml6\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/edc97b39-8677-476a-b336-1645f172c219-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cn4\" (UniqueName: \"kubernetes.io/projected/01be826f-c4c9-48db-9dd0-5f614a3781b8-kube-api-access-q8cn4\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85e67779-627e-4c7b-8105-8bb93f10ec15-installation-pull-secrets\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2g9n\" (UniqueName: \"kubernetes.io/projected/d47d4aa8-edf6-48a1-ae0f-850cee2dc44f-kube-api-access-m2g9n\") pod \"ingress-canary-jjfm7\" (UID: \"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f\") " pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4cq\" (UniqueName: \"kubernetes.io/projected/a8197b9a-a197-4e40-bebb-e8308ec1c094-kube-api-access-jb4cq\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841212 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01be826f-c4c9-48db-9dd0-5f614a3781b8-certs\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841251 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/211ba440-450f-4fbe-aaf8-b540716338d1-node-pullsecrets\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-csi-data-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.841959 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-plugins-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: E0317 
01:13:12.842394 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.342371622 +0000 UTC m=+218.974604600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.845722 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-encryption-config\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.846122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-mountpoint-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.846625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-proxy-tls\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 
01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.848660 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-trusted-ca\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.850152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-config\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.850429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-etcd-client\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.850877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/211ba440-450f-4fbe-aaf8-b540716338d1-audit-dir\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.851312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc 
kubenswrapper[4735]: I0317 01:13:12.851607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-registration-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.851665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.852042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-audit\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.852689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-etcd-serving-ca\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.856384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/748d17ea-e9d8-429c-8ed8-a94c3d6c23d4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2wn4z\" (UID: \"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:12 crc 
kubenswrapper[4735]: I0317 01:13:12.856988 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/211ba440-450f-4fbe-aaf8-b540716338d1-image-import-ca\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.858136 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-certificates\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.858226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3958599a-489b-430d-bbb4-56d7e88945d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.858816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.859504 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a8197b9a-a197-4e40-bebb-e8308ec1c094-socket-dir\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " 
pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.860753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.861430 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85e67779-627e-4c7b-8105-8bb93f10ec15-ca-trust-extracted\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.862933 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3958599a-489b-430d-bbb4-56d7e88945d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.863799 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3958599a-489b-430d-bbb4-56d7e88945d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r4wq7\" (UID: \"3958599a-489b-430d-bbb4-56d7e88945d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.870936 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25sg9\" 
(UniqueName: \"kubernetes.io/projected/5d734bc7-81d8-456d-8a09-5e138fa3ab1c-kube-api-access-25sg9\") pod \"machine-config-controller-84d6567774-pqxxs\" (UID: \"5d734bc7-81d8-456d-8a09-5e138fa3ab1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.871027 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85e67779-627e-4c7b-8105-8bb93f10ec15-installation-pull-secrets\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.872481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d47d4aa8-edf6-48a1-ae0f-850cee2dc44f-cert\") pod \"ingress-canary-jjfm7\" (UID: \"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f\") " pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.873356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4cq\" (UniqueName: \"kubernetes.io/projected/a8197b9a-a197-4e40-bebb-e8308ec1c094-kube-api-access-jb4cq\") pod \"csi-hostpathplugin-vwwj9\" (UID: \"a8197b9a-a197-4e40-bebb-e8308ec1c094\") " pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.883555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211ba440-450f-4fbe-aaf8-b540716338d1-serving-cert\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.891714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/a039ba69-6339-4e2d-a131-2018bfcc356d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sgbsk\" (UID: \"a039ba69-6339-4e2d-a131-2018bfcc356d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.912467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnrbc\" (UniqueName: \"kubernetes.io/projected/211ba440-450f-4fbe-aaf8-b540716338d1-kube-api-access-gnrbc\") pod \"apiserver-76f77b778f-qb49c\" (UID: \"211ba440-450f-4fbe-aaf8-b540716338d1\") " pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.912829 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprwb\" (UniqueName: \"kubernetes.io/projected/c4bd5744-869c-4763-af43-3ffcce4d549f-kube-api-access-hprwb\") pod \"auto-csr-approver-29561832-p6vx6\" (UID: \"c4bd5744-869c-4763-af43-3ffcce4d549f\") " pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.915197 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs75q\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-kube-api-access-xs75q\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.915607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2g9n\" (UniqueName: \"kubernetes.io/projected/d47d4aa8-edf6-48a1-ae0f-850cee2dc44f-kube-api-access-m2g9n\") pod \"ingress-canary-jjfm7\" (UID: \"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f\") " pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.921064 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/748d17ea-e9d8-429c-8ed8-a94c3d6c23d4-kube-api-access-82cbf\") pod \"cluster-samples-operator-665b6dd947-2wn4z\" (UID: \"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.925972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5h6r\" (UniqueName: \"kubernetes.io/projected/b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b-kube-api-access-b5h6r\") pod \"kube-storage-version-migrator-operator-b67b599dd-pjwxx\" (UID: \"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.927340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-bound-sa-token\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.939622 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.943220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pm8\" (UniqueName: \"kubernetes.io/projected/a039ba69-6339-4e2d-a131-2018bfcc356d-kube-api-access-v5pm8\") pod \"multus-admission-controller-857f4d67dd-sgbsk\" (UID: \"a039ba69-6339-4e2d-a131-2018bfcc356d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/edc97b39-8677-476a-b336-1645f172c219-srv-cert\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944407 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnm6\" (UniqueName: \"kubernetes.io/projected/11908f1a-0399-4964-86cc-8a2e51d35821-kube-api-access-zgnm6\") pod \"package-server-manager-789f6589d5-6xl7d\" (UID: \"11908f1a-0399-4964-86cc-8a2e51d35821\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ae1e8782-bee4-4a76-837f-69e3735abc86-proxy-tls\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3fe26de-7115-40ae-a335-8feafa74fb68-config-volume\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd537a4-457f-47fc-9ed0-07044027b0d5-config-volume\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/11908f1a-0399-4964-86cc-8a2e51d35821-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xl7d\" (UID: \"11908f1a-0399-4964-86cc-8a2e51d35821\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944882 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-default-certificate\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.944951 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzw5l\" (UniqueName: \"kubernetes.io/projected/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-kube-api-access-wzw5l\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945022 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db479d6c-1a3c-4de1-86be-ba6806eac784-signing-key\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4299c484-7f23-433c-a4b9-d3daf740fa7c-apiservice-cert\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-stats-auth\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqf6n\" (UniqueName: \"kubernetes.io/projected/db479d6c-1a3c-4de1-86be-ba6806eac784-kube-api-access-vqf6n\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc 
kubenswrapper[4735]: I0317 01:13:12.945290 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01be826f-c4c9-48db-9dd0-5f614a3781b8-node-bootstrap-token\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/125b8018-6680-4e83-b5b3-a8b177406d9d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945417 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae1e8782-bee4-4a76-837f-69e3735abc86-images\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945487 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a33da5ce-b959-43e3-a18c-99c8d5e7af40-config\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4299c484-7f23-433c-a4b9-d3daf740fa7c-tmpfs\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjml6\" (UniqueName: \"kubernetes.io/projected/4299c484-7f23-433c-a4b9-d3daf740fa7c-kube-api-access-bjml6\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945777 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/edc97b39-8677-476a-b336-1645f172c219-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945890 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cn4\" (UniqueName: \"kubernetes.io/projected/01be826f-c4c9-48db-9dd0-5f614a3781b8-kube-api-access-q8cn4\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.945979 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01be826f-c4c9-48db-9dd0-5f614a3781b8-certs\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946117 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4299c484-7f23-433c-a4b9-d3daf740fa7c-webhook-cert\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbd537a4-457f-47fc-9ed0-07044027b0d5-metrics-tls\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwj9\" (UniqueName: \"kubernetes.io/projected/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-kube-api-access-8fwj9\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/687efab0-3f01-453e-b202-46d47422c46d-profile-collector-cert\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946420 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/125b8018-6680-4e83-b5b3-a8b177406d9d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db479d6c-1a3c-4de1-86be-ba6806eac784-signing-cabundle\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946639 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbb6r\" (UniqueName: \"kubernetes.io/projected/ecb29df1-b4b4-473a-8795-564a241ce156-kube-api-access-qbb6r\") pod \"migrator-59844c95c7-kcqhw\" (UID: \"ecb29df1-b4b4-473a-8795-564a241ce156\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpvbh\" (UniqueName: \"kubernetes.io/projected/76c60596-d8ac-452c-bf33-b35157ee0970-kube-api-access-mpvbh\") pod \"control-plane-machine-set-operator-78cbb6b69f-hsd5h\" (UID: \"76c60596-d8ac-452c-bf33-b35157ee0970\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946792 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvj2t\" (UniqueName: \"kubernetes.io/projected/687efab0-3f01-453e-b202-46d47422c46d-kube-api-access-tvj2t\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946956 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/687efab0-3f01-453e-b202-46d47422c46d-srv-cert\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125b8018-6680-4e83-b5b3-a8b177406d9d-config\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.946986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3fe26de-7115-40ae-a335-8feafa74fb68-secret-volume\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpmn\" (UniqueName: 
\"kubernetes.io/projected/a3fe26de-7115-40ae-a335-8feafa74fb68-kube-api-access-whpmn\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfx8f\" (UniqueName: \"kubernetes.io/projected/dbd537a4-457f-47fc-9ed0-07044027b0d5-kube-api-access-mfx8f\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/76c60596-d8ac-452c-bf33-b35157ee0970-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hsd5h\" (UID: \"76c60596-d8ac-452c-bf33-b35157ee0970\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947106 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-metrics-certs\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947124 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae1e8782-bee4-4a76-837f-69e3735abc86-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc 
kubenswrapper[4735]: I0317 01:13:12.947159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj9zg\" (UniqueName: \"kubernetes.io/projected/a33da5ce-b959-43e3-a18c-99c8d5e7af40-kube-api-access-fj9zg\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a33da5ce-b959-43e3-a18c-99c8d5e7af40-serving-cert\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947189 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zfdr\" (UniqueName: \"kubernetes.io/projected/edc97b39-8677-476a-b336-1645f172c219-kube-api-access-2zfdr\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947204 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjggq\" (UniqueName: \"kubernetes.io/projected/ae1e8782-bee4-4a76-837f-69e3735abc86-kube-api-access-wjggq\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-service-ca-bundle\") pod 
\"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.947822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-service-ca-bundle\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.958717 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd537a4-457f-47fc-9ed0-07044027b0d5-config-volume\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.964179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.965820 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a33da5ce-b959-43e3-a18c-99c8d5e7af40-config\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.966631 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/edc97b39-8677-476a-b336-1645f172c219-srv-cert\") pod 
\"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.967051 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae1e8782-bee4-4a76-837f-69e3735abc86-proxy-tls\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: E0317 01:13:12.967384 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.467368535 +0000 UTC m=+219.099601513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.967528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/76c60596-d8ac-452c-bf33-b35157ee0970-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hsd5h\" (UID: \"76c60596-d8ac-452c-bf33-b35157ee0970\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.969060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db479d6c-1a3c-4de1-86be-ba6806eac784-signing-key\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.969500 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-metrics-certs\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.969704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3fe26de-7115-40ae-a335-8feafa74fb68-secret-volume\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.969995 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae1e8782-bee4-4a76-837f-69e3735abc86-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.973985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/687efab0-3f01-453e-b202-46d47422c46d-srv-cert\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.974483 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125b8018-6680-4e83-b5b3-a8b177406d9d-config\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.975522 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a33da5ce-b959-43e3-a18c-99c8d5e7af40-serving-cert\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.976636 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.980501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzw5l\" (UniqueName: \"kubernetes.io/projected/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-kube-api-access-wzw5l\") pod \"marketplace-operator-79b997595-4rpxx\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.981088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnm6\" (UniqueName: \"kubernetes.io/projected/11908f1a-0399-4964-86cc-8a2e51d35821-kube-api-access-zgnm6\") pod \"package-server-manager-789f6589d5-6xl7d\" (UID: \"11908f1a-0399-4964-86cc-8a2e51d35821\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.991647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-default-certificate\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.992309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db479d6c-1a3c-4de1-86be-ba6806eac784-signing-cabundle\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.994893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01be826f-c4c9-48db-9dd0-5f614a3781b8-node-bootstrap-token\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.986463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-stats-auth\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.995670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4299c484-7f23-433c-a4b9-d3daf740fa7c-tmpfs\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.996957 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4299c484-7f23-433c-a4b9-d3daf740fa7c-webhook-cert\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:12 crc kubenswrapper[4735]: I0317 01:13:12.998334 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpvbh\" (UniqueName: \"kubernetes.io/projected/76c60596-d8ac-452c-bf33-b35157ee0970-kube-api-access-mpvbh\") pod \"control-plane-machine-set-operator-78cbb6b69f-hsd5h\" (UID: \"76c60596-d8ac-452c-bf33-b35157ee0970\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:12.946533 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3fe26de-7115-40ae-a335-8feafa74fb68-config-volume\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.001563 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.002767 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01be826f-c4c9-48db-9dd0-5f614a3781b8-certs\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.003787 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-2jwr2" podStartSLOduration=153.003762268 podStartE2EDuration="2m33.003762268s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:12.939151568 +0000 UTC m=+218.571384546" watchObservedRunningTime="2026-03-17 01:13:13.003762268 +0000 UTC m=+218.635995246" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.004485 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae1e8782-bee4-4a76-837f-69e3735abc86-images\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.007025 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4299c484-7f23-433c-a4b9-d3daf740fa7c-apiservice-cert\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.008322 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/11908f1a-0399-4964-86cc-8a2e51d35821-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xl7d\" (UID: \"11908f1a-0399-4964-86cc-8a2e51d35821\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.023689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cn4\" (UniqueName: \"kubernetes.io/projected/01be826f-c4c9-48db-9dd0-5f614a3781b8-kube-api-access-q8cn4\") pod \"machine-config-server-r87ml\" (UID: \"01be826f-c4c9-48db-9dd0-5f614a3781b8\") " pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.026193 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpmn\" (UniqueName: \"kubernetes.io/projected/a3fe26de-7115-40ae-a335-8feafa74fb68-kube-api-access-whpmn\") pod \"collect-profiles-29561820-2vw78\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.034578 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/edc97b39-8677-476a-b336-1645f172c219-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.055310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/125b8018-6680-4e83-b5b3-a8b177406d9d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.065002 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.066575 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.566552182 +0000 UTC m=+219.198785160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.074567 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.075459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/687efab0-3f01-453e-b202-46d47422c46d-profile-collector-cert\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.081418 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.105780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbd537a4-457f-47fc-9ed0-07044027b0d5-metrics-tls\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.109537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqf6n\" (UniqueName: \"kubernetes.io/projected/db479d6c-1a3c-4de1-86be-ba6806eac784-kube-api-access-vqf6n\") pod \"service-ca-9c57cc56f-cwkhd\" (UID: \"db479d6c-1a3c-4de1-86be-ba6806eac784\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.109970 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.110891 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.113877 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.114529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfx8f\" (UniqueName: \"kubernetes.io/projected/dbd537a4-457f-47fc-9ed0-07044027b0d5-kube-api-access-mfx8f\") pod \"dns-default-tn9pc\" (UID: \"dbd537a4-457f-47fc-9ed0-07044027b0d5\") " pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.130259 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj9zg\" (UniqueName: \"kubernetes.io/projected/a33da5ce-b959-43e3-a18c-99c8d5e7af40-kube-api-access-fj9zg\") pod \"service-ca-operator-777779d784-qdv79\" (UID: \"a33da5ce-b959-43e3-a18c-99c8d5e7af40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.137442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zfdr\" (UniqueName: \"kubernetes.io/projected/edc97b39-8677-476a-b336-1645f172c219-kube-api-access-2zfdr\") pod \"olm-operator-6b444d44fb-kr886\" (UID: \"edc97b39-8677-476a-b336-1645f172c219\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.144586 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.162876 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.163704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjggq\" (UniqueName: \"kubernetes.io/projected/ae1e8782-bee4-4a76-837f-69e3735abc86-kube-api-access-wjggq\") pod \"machine-config-operator-74547568cd-4rx6t\" (UID: \"ae1e8782-bee4-4a76-837f-69e3735abc86\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.178980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.180330 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.180664 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.680649203 +0000 UTC m=+219.312882181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.183221 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvj2t\" (UniqueName: \"kubernetes.io/projected/687efab0-3f01-453e-b202-46d47422c46d-kube-api-access-tvj2t\") pod \"catalog-operator-68c6474976-mk2xg\" (UID: \"687efab0-3f01-453e-b202-46d47422c46d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.183980 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.193764 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjml6\" (UniqueName: \"kubernetes.io/projected/4299c484-7f23-433c-a4b9-d3daf740fa7c-kube-api-access-bjml6\") pod \"packageserver-d55dfcdfc-rxdbc\" (UID: \"4299c484-7f23-433c-a4b9-d3daf740fa7c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.201069 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.201501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fwj9\" (UniqueName: \"kubernetes.io/projected/87ef90f4-2d13-4f4c-9bbc-0b438b75a901-kube-api-access-8fwj9\") pod \"router-default-5444994796-s2dvf\" (UID: \"87ef90f4-2d13-4f4c-9bbc-0b438b75a901\") " pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.202450 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/125b8018-6680-4e83-b5b3-a8b177406d9d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dg6b7\" (UID: \"125b8018-6680-4e83-b5b3-a8b177406d9d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.203425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbb6r\" (UniqueName: \"kubernetes.io/projected/ecb29df1-b4b4-473a-8795-564a241ce156-kube-api-access-qbb6r\") pod \"migrator-59844c95c7-kcqhw\" (UID: \"ecb29df1-b4b4-473a-8795-564a241ce156\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.209062 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.209502 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.215561 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jjfm7" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.226055 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-r87ml" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.243326 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.293130 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.294019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.294270 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.794247791 +0000 UTC m=+219.426480769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.358267 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.361975 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.369691 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.380878 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qcs85"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.384841 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.395045 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.395318 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.895304684 +0000 UTC m=+219.527537662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.395625 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.425495 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.435735 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.475318 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.495694 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.496127 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:13.996107931 +0000 UTC m=+219.628340909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.598633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.599244 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.099224127 +0000 UTC m=+219.731457105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.612711 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.651552 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lsrzv"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.687214 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8rdqg"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.702879 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.703301 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.203281175 +0000 UTC m=+219.835514153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.721391 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nr62z"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.725014 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zgxs6"] Mar 17 01:13:13 crc kubenswrapper[4735]: W0317 01:13:13.745899 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3958599a_489b_430d_bbb4_56d7e88945d5.slice/crio-efe84690539f1266342cb2aaf8dd6404b3975fc01634b9804ae91b1ca326dfb8 WatchSource:0}: Error finding container efe84690539f1266342cb2aaf8dd6404b3975fc01634b9804ae91b1ca326dfb8: Status 404 returned error can't find the container with id efe84690539f1266342cb2aaf8dd6404b3975fc01634b9804ae91b1ca326dfb8 Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.763782 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" podStartSLOduration=152.763760401 podStartE2EDuration="2m32.763760401s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:13.75849142 +0000 UTC m=+219.390724398" watchObservedRunningTime="2026-03-17 01:13:13.763760401 +0000 UTC m=+219.395993379" Mar 17 01:13:13 crc 
kubenswrapper[4735]: I0317 01:13:13.778784 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.791225 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwwj9"] Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.804766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.805117 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.305102778 +0000 UTC m=+219.937335756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: W0317 01:13:13.869263 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e8ca98_69e8_409a_acc1_85ee705dfede.slice/crio-8ca406724953e5565a219392314fa5167fa058ba229e0fe3aca8fa237174b29d WatchSource:0}: Error finding container 8ca406724953e5565a219392314fa5167fa058ba229e0fe3aca8fa237174b29d: Status 404 returned error can't find the container with id 8ca406724953e5565a219392314fa5167fa058ba229e0fe3aca8fa237174b29d Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.878919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nh28b" event={"ID":"68d08823-e5f7-48eb-898e-3e59c772c8e9","Type":"ContainerStarted","Data":"12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9"} Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.883929 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" event={"ID":"eb41bde3-7791-4ac9-bda5-8c094ab6c904","Type":"ContainerStarted","Data":"718f2bb5a53109f2955c69c6f98d6fe147a616df7713574f2ee15fca8c8fa0e0"} Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.891556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" 
event={"ID":"fce79315-fdfb-4384-b901-053a3482b27b","Type":"ContainerStarted","Data":"97990571ec9b00d178403290549da339bdadeac3688aeb7b54ea7bf6a427b4f6"} Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.894347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" event={"ID":"8839a674-43ea-4a76-a2c5-b56261bac28d","Type":"ContainerStarted","Data":"cd2b88a34680c53a74d3f0973999a03642afea53273d5795dacea6250b729874"} Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.911360 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:13 crc kubenswrapper[4735]: E0317 01:13:13.917630 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.417598068 +0000 UTC m=+220.049831046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.929199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" event={"ID":"d1d16522-3b99-4ee4-b1ae-901b135c661d","Type":"ContainerStarted","Data":"61f4e6c2c5aabfdc549bfcf2d59541d60d1b4f05f62770a0e99ee04e0dc7f980"} Mar 17 01:13:13 crc kubenswrapper[4735]: I0317 01:13:13.997874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" event={"ID":"3958599a-489b-430d-bbb4-56d7e88945d5","Type":"ContainerStarted","Data":"efe84690539f1266342cb2aaf8dd6404b3975fc01634b9804ae91b1ca326dfb8"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.013332 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nh28b" podStartSLOduration=154.013305918 podStartE2EDuration="2m34.013305918s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:13.920347787 +0000 UTC m=+219.552580785" watchObservedRunningTime="2026-03-17 01:13:14.013305918 +0000 UTC m=+219.645538896" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.027036 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.027892 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.527874763 +0000 UTC m=+220.160107741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.044804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" event={"ID":"1868585d-68a4-4a7f-85c2-fd1c4eef0532","Type":"ContainerStarted","Data":"3f268c33508cfb6d7a13bd249648b5ba5cb6722ef2879f85ec3cf4c99422379d"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.046184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" event={"ID":"772d764a-7cf4-443a-93b6-a84726c59355","Type":"ContainerStarted","Data":"e312a1e072e4f9a538c78df268f57b991ca365fd300256fe59dced95428b885f"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.072473 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-scxl7" 
podStartSLOduration=153.072452721 podStartE2EDuration="2m33.072452721s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:14.015732268 +0000 UTC m=+219.647965246" watchObservedRunningTime="2026-03-17 01:13:14.072452721 +0000 UTC m=+219.704685699" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.072822 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sgbsk"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.135675 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.136731 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.636709222 +0000 UTC m=+220.268942200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: W0317 01:13:14.165993 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01be826f_c4c9_48db_9dd0_5f614a3781b8.slice/crio-7c5156546a15c5bd2ec1de84187cd3d942910af7eb14aedbc35cf73e17b5f59f WatchSource:0}: Error finding container 7c5156546a15c5bd2ec1de84187cd3d942910af7eb14aedbc35cf73e17b5f59f: Status 404 returned error can't find the container with id 7c5156546a15c5bd2ec1de84187cd3d942910af7eb14aedbc35cf73e17b5f59f Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.172167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" event={"ID":"46967c4e-e4ac-403f-93b0-c2315a9a067f","Type":"ContainerStarted","Data":"9b7f45318cb81c088cdc3f6a4418e3203b588b2252f9e145c1076c742b921d77"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.231543 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rh7dw" podStartSLOduration=154.231522708 podStartE2EDuration="2m34.231522708s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:14.14346104 +0000 UTC m=+219.775694018" watchObservedRunningTime="2026-03-17 01:13:14.231522708 +0000 UTC m=+219.863755686" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 
01:13:14.231739 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p697v" podStartSLOduration=154.231734374 podStartE2EDuration="2m34.231734374s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:14.229406866 +0000 UTC m=+219.861639844" watchObservedRunningTime="2026-03-17 01:13:14.231734374 +0000 UTC m=+219.863967342" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.238094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" event={"ID":"a0fdf8be-34e0-4ed0-b462-13b38cecfd73","Type":"ContainerStarted","Data":"33ae540d90b1bd0485f8b4ce31c933e61f4c61f8980be75fd1040618380d91f8"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.239076 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.240195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.248106 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.748066723 +0000 UTC m=+220.380299701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.259038 4735 patch_prober.go:28] interesting pod/console-operator-58897d9998-j2xfj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.259089 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" podUID="a0fdf8be-34e0-4ed0-b462-13b38cecfd73" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.261347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" event={"ID":"f326c014-e2bb-4d5a-a0ac-580a61a041f0","Type":"ContainerStarted","Data":"675847c0d98a131d9338742cc981e2d29a2ca77cd6f4526b8d802c6ba8eef8e2"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.261829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.275128 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" 
event={"ID":"8e1db38b-1243-4309-8a2e-961697fc030a","Type":"ContainerStarted","Data":"fa6a764395b17c37ca4cd68b8005ba3130c9f97b038403af116dd0e53960c319"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.280401 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.301745 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.305163 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" event={"ID":"405c316c-f04d-4e00-9009-35dbce4a94c6","Type":"ContainerStarted","Data":"65f8c8579cbc311bf26492d4989a8b7205ba3d3a845d41247ae7b6399d6303d0"} Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.305441 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qb49c"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.306804 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" podStartSLOduration=154.30615565 podStartE2EDuration="2m34.30615565s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:14.288989439 +0000 UTC m=+219.921222417" watchObservedRunningTime="2026-03-17 01:13:14.30615565 +0000 UTC m=+219.938388628" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.333032 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rpxx"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.334172 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" podStartSLOduration=153.334145342 podStartE2EDuration="2m33.334145342s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:14.324209403 +0000 UTC m=+219.956442381" watchObservedRunningTime="2026-03-17 01:13:14.334145342 +0000 UTC m=+219.966378340" Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.342456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.344400 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.844362287 +0000 UTC m=+220.476595265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.393641 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.445247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.445611 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:14.945595745 +0000 UTC m=+220.577828723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: W0317 01:13:14.533944 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e7fc1d_5357_4cb7_a5cf_4c6a47e5a00b.slice/crio-b4ec2e2f2a42c590f3fa60970fa677eac532e57d5f48cb33655345c46ec20785 WatchSource:0}: Error finding container b4ec2e2f2a42c590f3fa60970fa677eac532e57d5f48cb33655345c46ec20785: Status 404 returned error can't find the container with id b4ec2e2f2a42c590f3fa60970fa677eac532e57d5f48cb33655345c46ec20785 Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.547278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.547639 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.047618743 +0000 UTC m=+220.679851721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.661691 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.662670 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.162651607 +0000 UTC m=+220.794884585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.712920 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tn9pc"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.747009 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886"] Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.764370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.764486 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.26446784 +0000 UTC m=+220.896700818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.764775 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.765223 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.265215099 +0000 UTC m=+220.897448077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.867001 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.867193 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.367159554 +0000 UTC m=+220.999392532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.868689 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.869259 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.369227056 +0000 UTC m=+221.001460034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:14 crc kubenswrapper[4735]: I0317 01:13:14.971640 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:14 crc kubenswrapper[4735]: E0317 01:13:14.972553 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.472534106 +0000 UTC m=+221.104767084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: W0317 01:13:15.037623 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd537a4_457f_47fc_9ed0_07044027b0d5.slice/crio-ac5e0e23e0aa94ae63fb3c15154199a459703a2845202b210e60407f2b0caaa4 WatchSource:0}: Error finding container ac5e0e23e0aa94ae63fb3c15154199a459703a2845202b210e60407f2b0caaa4: Status 404 returned error can't find the container with id ac5e0e23e0aa94ae63fb3c15154199a459703a2845202b210e60407f2b0caaa4 Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.107601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.109537 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.60951657 +0000 UTC m=+221.241749548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.208982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.209807 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.709785364 +0000 UTC m=+221.342018342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.321776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.322138 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.82212278 +0000 UTC m=+221.454355758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.368749 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z"] Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.426823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.427237 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:15.927218554 +0000 UTC m=+221.559451532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.490246 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs"] Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.504924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" event={"ID":"96e8ca98-69e8-409a-acc1-85ee705dfede","Type":"ContainerStarted","Data":"8ca406724953e5565a219392314fa5167fa058ba229e0fe3aca8fa237174b29d"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.525555 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t"] Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.528274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.528630 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:16.028614477 +0000 UTC m=+221.660847455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.545758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-r87ml" event={"ID":"01be826f-c4c9-48db-9dd0-5f614a3781b8","Type":"ContainerStarted","Data":"7c5156546a15c5bd2ec1de84187cd3d942910af7eb14aedbc35cf73e17b5f59f"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.628966 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.629379 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.129358823 +0000 UTC m=+221.761591801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.632134 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" event={"ID":"edc97b39-8677-476a-b336-1645f172c219","Type":"ContainerStarted","Data":"b68faf6886062c7b99af26e99b978958f864f9c9d820123ea7259bf7fcde9790"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.654814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" event={"ID":"a039ba69-6339-4e2d-a131-2018bfcc356d","Type":"ContainerStarted","Data":"f586a8673ce3211278a34375c0cd49c16879fb66c16e860382ccf63e132a08d6"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.657667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" event={"ID":"211ba440-450f-4fbe-aaf8-b540716338d1","Type":"ContainerStarted","Data":"b5738495bc66fb1ec6811588c990d6fe1e4ac8502d9c7034a393e564a9591045"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.662667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" event={"ID":"1868585d-68a4-4a7f-85c2-fd1c4eef0532","Type":"ContainerStarted","Data":"0402254fdead0515e07a53815634fb35468a35108bed6551fa35fc65d64aa382"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.730391 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7"] Mar 17 
01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.734593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.735022 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.235005711 +0000 UTC m=+221.867238689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.750670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" event={"ID":"6bdc2a09-1108-46eb-bfad-41ddee55a93c","Type":"ContainerStarted","Data":"f3b7ea804ccf5c92e1e809a02e83f8ee868ae9ef311fafcae276fd969773cd00"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.751300 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.753779 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qdv79"] Mar 17 01:13:15 
crc kubenswrapper[4735]: I0317 01:13:15.781309 4735 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zgxs6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.781382 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.821221 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s2dvf" event={"ID":"87ef90f4-2d13-4f4c-9bbc-0b438b75a901","Type":"ContainerStarted","Data":"c4d1b4431cb9196bd852191dfc0ac7ca799890e16093be84fd9975e7596acaab"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.821271 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s2dvf" event={"ID":"87ef90f4-2d13-4f4c-9bbc-0b438b75a901","Type":"ContainerStarted","Data":"9f9f536c65f0bc3e7dc202cfe4ed58490f15b2c6af5d3e3a179d243101c16111"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.835284 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.837637 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.337606363 +0000 UTC m=+221.969839521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.872814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" event={"ID":"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c","Type":"ContainerStarted","Data":"f68c47ed97796425f9488ce2ad4fae98b5c5f36b8ce344b4add434ad07238311"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.925090 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h"] Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.939813 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:15 crc kubenswrapper[4735]: E0317 01:13:15.940383 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.44036867 +0000 UTC m=+222.072601648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.943929 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc"] Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.958181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tn9pc" event={"ID":"dbd537a4-457f-47fc-9ed0-07044027b0d5","Type":"ContainerStarted","Data":"ac5e0e23e0aa94ae63fb3c15154199a459703a2845202b210e60407f2b0caaa4"} Mar 17 01:13:15 crc kubenswrapper[4735]: I0317 01:13:15.976491 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cwkhd"] Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.053497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lsrzv" event={"ID":"60ccc3db-c677-47ff-8113-5eb103809a4a","Type":"ContainerStarted","Data":"59fdc6bb0400301d1a31cba4759b9ecac87cdb84540cc4ba9d3af878a7ffeebf"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.059007 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.059064 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-p6vx6"] Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.071004 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.071779 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.571755384 +0000 UTC m=+222.203988362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.071832 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg"] Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.101247 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.101471 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 01:13:16 crc kubenswrapper[4735]: W0317 
01:13:16.135686 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c60596_d8ac_452c_bf33_b35157ee0970.slice/crio-0e8f729bddd4e33318c1f565d7af6de56b1f3281f77e76e85185077184425a95 WatchSource:0}: Error finding container 0e8f729bddd4e33318c1f565d7af6de56b1f3281f77e76e85185077184425a95: Status 404 returned error can't find the container with id 0e8f729bddd4e33318c1f565d7af6de56b1f3281f77e76e85185077184425a95 Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.146591 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" podStartSLOduration=156.146566879 podStartE2EDuration="2m36.146566879s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.104292379 +0000 UTC m=+221.736525357" watchObservedRunningTime="2026-03-17 01:13:16.146566879 +0000 UTC m=+221.778799857" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.167254 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-s2dvf" podStartSLOduration=155.167233137 podStartE2EDuration="2m35.167233137s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.134978938 +0000 UTC m=+221.767211926" watchObservedRunningTime="2026-03-17 01:13:16.167233137 +0000 UTC m=+221.799466105" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.168498 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78"] Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.168540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" event={"ID":"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b","Type":"ContainerStarted","Data":"b4ec2e2f2a42c590f3fa60970fa677eac532e57d5f48cb33655345c46ec20785"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.173447 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.173917 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.673900114 +0000 UTC m=+222.306133092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: W0317 01:13:16.231142 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb479d6c_1a3c_4de1_86be_ba6806eac784.slice/crio-55c787e1294dc85c09fdde222673120f443721c41e604c9836a93f66879c5b30 WatchSource:0}: Error finding container 55c787e1294dc85c09fdde222673120f443721c41e604c9836a93f66879c5b30: Status 404 returned error can't find the container with id 55c787e1294dc85c09fdde222673120f443721c41e604c9836a93f66879c5b30 Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.232630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" event={"ID":"d1d16522-3b99-4ee4-b1ae-901b135c661d","Type":"ContainerStarted","Data":"936671916dff1a8fa6a6ff1eade0ac1c1a5c2af63885ffcc2096725831c6a98f"} Mar 17 01:13:16 crc kubenswrapper[4735]: W0317 01:13:16.236088 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687efab0_3f01_453e_b202_46d47422c46d.slice/crio-62a775f529dacb9e567b2141d92a7f3c4ae9f14a7355a28cb9633208d054d298 WatchSource:0}: Error finding container 62a775f529dacb9e567b2141d92a7f3c4ae9f14a7355a28cb9633208d054d298: Status 404 returned error can't find the container with id 62a775f529dacb9e567b2141d92a7f3c4ae9f14a7355a28cb9633208d054d298 Mar 17 01:13:16 crc kubenswrapper[4735]: W0317 01:13:16.241053 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4bd5744_869c_4763_af43_3ffcce4d549f.slice/crio-789fe4a4cfb775bba4a4a34b693b5a8482014a8eb04f37e3039427226b6ca56e WatchSource:0}: Error finding container 789fe4a4cfb775bba4a4a34b693b5a8482014a8eb04f37e3039427226b6ca56e: Status 404 returned error can't find the container with id 789fe4a4cfb775bba4a4a34b693b5a8482014a8eb04f37e3039427226b6ca56e Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.282698 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" event={"ID":"a8197b9a-a197-4e40-bebb-e8308ec1c094","Type":"ContainerStarted","Data":"2880a30d2432671a41dba51f897449d84abd033e7adaf58824547e215e216c5d"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.283323 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.284682 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.784660071 +0000 UTC m=+222.416893049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.293156 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.296241 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jjfm7"] Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.312513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" event={"ID":"83a10c7e-ff3c-41b6-adb7-14243953ed5a","Type":"ContainerStarted","Data":"a2d5f1145340d04d84027862955a351a6919bba8c9132ad4e9d8793625ad0f69"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.327701 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lsrzv" podStartSLOduration=156.327671049 podStartE2EDuration="2m36.327671049s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.293490122 +0000 UTC m=+221.925723100" watchObservedRunningTime="2026-03-17 01:13:16.327671049 +0000 UTC m=+221.959904027" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.329044 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw"] Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.359758 4735 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" podStartSLOduration=155.359739783 podStartE2EDuration="2m35.359739783s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.359035806 +0000 UTC m=+221.991268784" watchObservedRunningTime="2026-03-17 01:13:16.359739783 +0000 UTC m=+221.991972761" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.385175 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.386465 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.886452152 +0000 UTC m=+222.518685130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.386495 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.388065 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.388130 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.392742 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" event={"ID":"405c316c-f04d-4e00-9009-35dbce4a94c6","Type":"ContainerStarted","Data":"c118ce02570e115c3aaa5548f8c9ee40e3ebc2ae5baa238c16b88be0299f01ce"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.422219 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqdp7" podStartSLOduration=155.422202559 podStartE2EDuration="2m35.422202559s" 
podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.421237595 +0000 UTC m=+222.053470563" watchObservedRunningTime="2026-03-17 01:13:16.422202559 +0000 UTC m=+222.054435537" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.435293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" event={"ID":"11908f1a-0399-4964-86cc-8a2e51d35821","Type":"ContainerStarted","Data":"22953969502e94e2afcc9ff2c39c8a186a9644c70a7f184bafabe8715c3d5ed7"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.469632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" event={"ID":"eb41bde3-7791-4ac9-bda5-8c094ab6c904","Type":"ContainerStarted","Data":"72e076d16a41ea4cadfcf93d3c6203fccd8b0f080db1d61c504985674e0361bd"} Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.485414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-j2xfj" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.485806 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.485986 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:16.985959778 +0000 UTC m=+222.618192766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.486445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.486493 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.486542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.486582 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.486604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.496593 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:16.996573853 +0000 UTC m=+222.628806831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.506575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.507989 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.518479 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.534585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.573706 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c56vf" podStartSLOduration=155.573678696 podStartE2EDuration="2m35.573678696s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.573351888 +0000 UTC m=+222.205584866" watchObservedRunningTime="2026-03-17 01:13:16.573678696 +0000 UTC m=+222.205911674" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.589053 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.590601 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.0905695 +0000 UTC m=+222.722802478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.653133 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.667155 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.668553 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60420: no serving certificate available for the kubelet" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.689419 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.694764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.695059 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:17.195045819 +0000 UTC m=+222.827278797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.800983 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.801173 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.301138829 +0000 UTC m=+222.933371807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.805206 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.805934 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.305912459 +0000 UTC m=+222.938145437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.823583 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60428: no serving certificate available for the kubelet" Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.908263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.909025 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.408993623 +0000 UTC m=+223.041226601 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.920056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:16 crc kubenswrapper[4735]: E0317 01:13:16.920821 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.420799929 +0000 UTC m=+223.053032907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:16 crc kubenswrapper[4735]: I0317 01:13:16.952951 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60444: no serving certificate available for the kubelet" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.021426 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.022468 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.522408347 +0000 UTC m=+223.154641345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.085424 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60458: no serving certificate available for the kubelet" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.131875 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.133273 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.633253555 +0000 UTC m=+223.265486533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.205819 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60472: no serving certificate available for the kubelet" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.235536 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.235871 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.735837267 +0000 UTC m=+223.368070245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.340075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.340423 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:17.840409889 +0000 UTC m=+223.472642867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.405095 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:17 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:17 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:17 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.405619 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.442397 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.443235 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:17.943216906 +0000 UTC m=+223.575449884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.486595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" event={"ID":"a33da5ce-b959-43e3-a18c-99c8d5e7af40","Type":"ContainerStarted","Data":"07ddeff52c5fe8b669a6f4f71b8a1cb1ba9a0c701f72cf763a73cdc866844c48"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.505030 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" event={"ID":"edc97b39-8677-476a-b336-1645f172c219","Type":"ContainerStarted","Data":"348e017e8edc93b33ed2fed9e9209c240b812f75bf5c4674fb98cfab5736270f"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.506097 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.514592 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" event={"ID":"a039ba69-6339-4e2d-a131-2018bfcc356d","Type":"ContainerStarted","Data":"abef68724c6118684a61316c4fcbfe29be2522011d4e66e73d3921f2d7814560"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.514706 4735 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kr886 container/olm-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.514743 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" podUID="edc97b39-8677-476a-b336-1645f172c219" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.523388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" event={"ID":"db479d6c-1a3c-4de1-86be-ba6806eac784","Type":"ContainerStarted","Data":"55c787e1294dc85c09fdde222673120f443721c41e604c9836a93f66879c5b30"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.533257 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" event={"ID":"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c","Type":"ContainerStarted","Data":"05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.534296 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.538607 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4rpxx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.538655 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.541840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" event={"ID":"125b8018-6680-4e83-b5b3-a8b177406d9d","Type":"ContainerStarted","Data":"516d179a79512c6943f4dbb68afc06640b9e746bdc04706581ba03b0ecc595ca"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.544268 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.544617 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.044605198 +0000 UTC m=+223.676838176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.555203 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" event={"ID":"5d734bc7-81d8-456d-8a09-5e138fa3ab1c","Type":"ContainerStarted","Data":"dd81485c4a3b49145d1267e13603c9c71a192f1d3f03bbce00fdd70327393a7c"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.555977 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" event={"ID":"4299c484-7f23-433c-a4b9-d3daf740fa7c","Type":"ContainerStarted","Data":"30cef4683579d91a0d02d9954db526b9c872d552ed0b966182adef77ab7dac47"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.556598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" event={"ID":"687efab0-3f01-453e-b202-46d47422c46d","Type":"ContainerStarted","Data":"62a775f529dacb9e567b2141d92a7f3c4ae9f14a7355a28cb9633208d054d298"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.557913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" event={"ID":"ae1e8782-bee4-4a76-837f-69e3735abc86","Type":"ContainerStarted","Data":"e7f708b0c87f992356bf32cb4615faefe97ae4307a1e2098f4a831f5e02162c7"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.569438 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60486: no serving 
certificate available for the kubelet" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.599802 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j76kq" podStartSLOduration=156.599787261 podStartE2EDuration="2m36.599787261s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:16.719030151 +0000 UTC m=+222.351263119" watchObservedRunningTime="2026-03-17 01:13:17.599787261 +0000 UTC m=+223.232020239" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.600980 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" podStartSLOduration=156.600973651 podStartE2EDuration="2m36.600973651s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:17.599191937 +0000 UTC m=+223.231424915" watchObservedRunningTime="2026-03-17 01:13:17.600973651 +0000 UTC m=+223.233206629" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.613456 4735 generic.go:334] "Generic (PLEG): container finished" podID="96e8ca98-69e8-409a-acc1-85ee705dfede" containerID="42a7b3911dcec3bb1bab8bf39f494d728cce2d8b9ccfcb7d319be35917bab93a" exitCode=0 Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.613584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" event={"ID":"96e8ca98-69e8-409a-acc1-85ee705dfede","Type":"ContainerDied","Data":"42a7b3911dcec3bb1bab8bf39f494d728cce2d8b9ccfcb7d319be35917bab93a"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.630938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pjwxx" event={"ID":"b9e7fc1d-5357-4cb7-a5cf-4c6a47e5a00b","Type":"ContainerStarted","Data":"c513d5658ed33c52e93c5e4dd8ad31f921d82d45fedd408fdd69809c212d19a6"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.646535 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.647536 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" podStartSLOduration=156.647513768 podStartE2EDuration="2m36.647513768s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:17.643124938 +0000 UTC m=+223.275357926" watchObservedRunningTime="2026-03-17 01:13:17.647513768 +0000 UTC m=+223.279746746" Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.647916 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.147878337 +0000 UTC m=+223.780111315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.648416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" event={"ID":"76c60596-d8ac-452c-bf33-b35157ee0970","Type":"ContainerStarted","Data":"0e8f729bddd4e33318c1f565d7af6de56b1f3281f77e76e85185077184425a95"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.650040 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" event={"ID":"ecb29df1-b4b4-473a-8795-564a241ce156","Type":"ContainerStarted","Data":"73bbfac3824d0b337a1cb33074da45dfda2d4821a39f2e8c8e2955d34bdbf794"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.656810 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" event={"ID":"1868585d-68a4-4a7f-85c2-fd1c4eef0532","Type":"ContainerStarted","Data":"78c9f125ae748358761af72ce7eea3e8443207d3fdd9ec1d5cba2c0133cc758e"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.668232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lsrzv" event={"ID":"60ccc3db-c677-47ff-8113-5eb103809a4a","Type":"ContainerStarted","Data":"c78915c769b829d8cff9315f90641fd2429e47e61a52e2e28cbae070702254ad"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.669711 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.669747 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.677477 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jjfm7" event={"ID":"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f","Type":"ContainerStarted","Data":"d938d81a0c1a605186ed1b693613d097ec5e4f7e8290d83cd82f379c05945e65"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.695148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" event={"ID":"6bdc2a09-1108-46eb-bfad-41ddee55a93c","Type":"ContainerStarted","Data":"e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.697636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" event={"ID":"c4bd5744-869c-4763-af43-3ffcce4d549f","Type":"ContainerStarted","Data":"789fe4a4cfb775bba4a4a34b693b5a8482014a8eb04f37e3039427226b6ca56e"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.730656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" event={"ID":"11908f1a-0399-4964-86cc-8a2e51d35821","Type":"ContainerStarted","Data":"8f9d2e36f4e39920670de36b3a59fc805d817a95cc8145bd9a7d47c5f905c5e5"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.747082 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-r87ml" event={"ID":"01be826f-c4c9-48db-9dd0-5f614a3781b8","Type":"ContainerStarted","Data":"a759fef55ff82dedb6ee167ca7fe0e4bf28914840f4270d4de21022b50a4d666"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.748474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.751799 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.251779882 +0000 UTC m=+223.884012850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.766609 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" event={"ID":"8e1db38b-1243-4309-8a2e-961697fc030a","Type":"ContainerStarted","Data":"25380d7736f01f21558fcd27ebf3774c696030c9261a8b87770c7e9f1b637e9f"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.768001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" event={"ID":"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4","Type":"ContainerStarted","Data":"ef48fe86db4dc9d217c4e3e7315ca9b890be3340b12f5c0b35212eb3040478e5"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.776830 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdx94" podStartSLOduration=156.776816369 podStartE2EDuration="2m36.776816369s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:17.774914012 +0000 UTC m=+223.407146990" watchObservedRunningTime="2026-03-17 01:13:17.776816369 +0000 UTC m=+223.409049337" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.792786 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" 
event={"ID":"8839a674-43ea-4a76-a2c5-b56261bac28d","Type":"ContainerStarted","Data":"54b31c5098b1cf01e569c730ed8d39fe1a4040e5891bdd4ea05b7458f883ed5b"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.848522 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-r87ml" podStartSLOduration=8.848504127 podStartE2EDuration="8.848504127s" podCreationTimestamp="2026-03-17 01:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:17.832165117 +0000 UTC m=+223.464398095" watchObservedRunningTime="2026-03-17 01:13:17.848504127 +0000 UTC m=+223.480737105" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.849550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.853151 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.353119222 +0000 UTC m=+223.985352200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.886176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tn9pc" event={"ID":"dbd537a4-457f-47fc-9ed0-07044027b0d5","Type":"ContainerStarted","Data":"6c9cb3c3a402d52bace6ae65b0aff1f8e04984fdd4f868309d26c125c5c78970"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.890512 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" event={"ID":"3958599a-489b-430d-bbb4-56d7e88945d5","Type":"ContainerStarted","Data":"578a98685d9c974f4857a52b8a8a6add964fa45b115583f60a9d546ef49a67eb"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.900257 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60500: no serving certificate available for the kubelet" Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.907133 4735 generic.go:334] "Generic (PLEG): container finished" podID="211ba440-450f-4fbe-aaf8-b540716338d1" containerID="3caea269eea13498a423bbc7fefb5df72c14653301d99e9a10b38e2e662ea47f" exitCode=0 Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.907192 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" event={"ID":"211ba440-450f-4fbe-aaf8-b540716338d1","Type":"ContainerDied","Data":"3caea269eea13498a423bbc7fefb5df72c14653301d99e9a10b38e2e662ea47f"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.928919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" event={"ID":"a3fe26de-7115-40ae-a335-8feafa74fb68","Type":"ContainerStarted","Data":"addf2143d7bac42ba40f2254509ace061a1dde2cae910f68919d96dbc802bc87"} Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.957013 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:17 crc kubenswrapper[4735]: E0317 01:13:17.958765 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.45872887 +0000 UTC m=+224.090961848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:17 crc kubenswrapper[4735]: I0317 01:13:17.964995 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qcs85" podStartSLOduration=156.964969677 podStartE2EDuration="2m36.964969677s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:17.913991718 +0000 UTC m=+223.546224696" watchObservedRunningTime="2026-03-17 01:13:17.964969677 +0000 UTC m=+223.597202655" Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.058327 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.059492 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.559471965 +0000 UTC m=+224.191704943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.078128 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r4wq7" podStartSLOduration=157.078104963 podStartE2EDuration="2m37.078104963s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:17.952616277 +0000 UTC m=+223.584849245" watchObservedRunningTime="2026-03-17 01:13:18.078104963 +0000 UTC m=+223.710337931" Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.169403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.169798 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.669783091 +0000 UTC m=+224.302016069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.281790 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.282159 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.782116897 +0000 UTC m=+224.414349875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.282531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.282993 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.782977209 +0000 UTC m=+224.415210187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.387732 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.388255 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.888233748 +0000 UTC m=+224.520466726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.397026 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:18 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:18 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:18 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.397086 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.401203 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60504: no serving certificate available for the kubelet" Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.490347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.490976 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:18.990962774 +0000 UTC m=+224.623195752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.597142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.597479 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.097462604 +0000 UTC m=+224.729695582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.699566 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.700239 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.200225429 +0000 UTC m=+224.832458407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.700296 4735 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zgxs6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.700321 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.800742 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.801001 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:19.300973845 +0000 UTC m=+224.933206823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.801236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.801545 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.301536609 +0000 UTC m=+224.933769587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:18 crc kubenswrapper[4735]: I0317 01:13:18.908513 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:18 crc kubenswrapper[4735]: E0317 01:13:18.909240 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.409223159 +0000 UTC m=+225.041456137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.011305 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" event={"ID":"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4","Type":"ContainerStarted","Data":"6da70f0d41f4a02c8c57f2d99bc287ea895b51960a3470f51c7dc7d162405220"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.011422 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.011929 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.511910284 +0000 UTC m=+225.144143262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.033091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"14185c3cfb0f6dc07aaa65b1bfe07a3698b546807710e225512a91b8e92e4d2b"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.058093 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" event={"ID":"8e1db38b-1243-4309-8a2e-961697fc030a","Type":"ContainerStarted","Data":"0e6e64a9baf83269836f09bc3bf71293fe58bb1faf9047705c63237a37b95235"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.100737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" event={"ID":"76c60596-d8ac-452c-bf33-b35157ee0970","Type":"ContainerStarted","Data":"64e0bc4e4dedc4759187f211f035babf97c94eeb92594961b5d4e529feb86630"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.104662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" event={"ID":"11908f1a-0399-4964-86cc-8a2e51d35821","Type":"ContainerStarted","Data":"6b3ab098070d28f6cf0bfc22e5314302e928fb58cc23c4dff861592d43c0bf6c"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.105077 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.107091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d795eb06f6ad41f66c337d9b53b59ae3df4f0bdbb1f459c1abb2383d4c588c8f"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.119526 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.120163 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.620131557 +0000 UTC m=+225.252364615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.125166 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" event={"ID":"a8197b9a-a197-4e40-bebb-e8308ec1c094","Type":"ContainerStarted","Data":"1529e3a1717dc0de41364fe2dd677596a5867dec1958ddc82c1b42b5434de6b7"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.127692 4735 generic.go:334] "Generic (PLEG): container finished" podID="83a10c7e-ff3c-41b6-adb7-14243953ed5a" containerID="b930c42d9271fdc456610503a18f9f87da0673736ed22f493e7c87d5473cfa65" exitCode=0 Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.127766 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" event={"ID":"83a10c7e-ff3c-41b6-adb7-14243953ed5a","Type":"ContainerDied","Data":"b930c42d9271fdc456610503a18f9f87da0673736ed22f493e7c87d5473cfa65"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.141002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" event={"ID":"a33da5ce-b959-43e3-a18c-99c8d5e7af40","Type":"ContainerStarted","Data":"c334d2deea7e69a3096fa195bd6368ca072359b72df4f19aa62623c0236fad60"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.154570 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8rdqg" podStartSLOduration=158.154531669 podStartE2EDuration="2m38.154531669s" podCreationTimestamp="2026-03-17 
01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.140508017 +0000 UTC m=+224.772740995" watchObservedRunningTime="2026-03-17 01:13:19.154531669 +0000 UTC m=+224.786764647" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.167490 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jjfm7" event={"ID":"d47d4aa8-edf6-48a1-ae0f-850cee2dc44f","Type":"ContainerStarted","Data":"896316cd86106f906d316757d48b2cf5c07f7e8fc3cf735563c165dc5c2559b1"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.191280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" event={"ID":"125b8018-6680-4e83-b5b3-a8b177406d9d","Type":"ContainerStarted","Data":"6bc2fa015151e02b0b9069ffb7c8e0009261af239722cd398d944c7c873e83c6"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.200816 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hsd5h" podStartSLOduration=158.200795968 podStartE2EDuration="2m38.200795968s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.196110461 +0000 UTC m=+224.828343439" watchObservedRunningTime="2026-03-17 01:13:19.200795968 +0000 UTC m=+224.833028966" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.218914 4735 ???:1] "http: TLS handshake error from 192.168.126.11:47656: no serving certificate available for the kubelet" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.221924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.222206 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.722194635 +0000 UTC m=+225.354427613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.247375 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d1a97fc2a4503db98b29a0bae49e6a54f2ead14ec86d1e7ecf6b9f6fdb6ef8c5"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.271730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" event={"ID":"a3fe26de-7115-40ae-a335-8feafa74fb68","Type":"ContainerStarted","Data":"aa799d9c8cf4e6466b4829b66470e10bc4def1e5479adcf9f4abc707b6f2c647"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.323416 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.325439 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.825413022 +0000 UTC m=+225.457646000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.390697 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" podStartSLOduration=158.390680079 podStartE2EDuration="2m38.390680079s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.384113904 +0000 UTC m=+225.016346882" watchObservedRunningTime="2026-03-17 01:13:19.390680079 +0000 UTC m=+225.022913057" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.403650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" 
event={"ID":"5d734bc7-81d8-456d-8a09-5e138fa3ab1c","Type":"ContainerStarted","Data":"207b31dbf2a012ebf84e4469073445a09621572c81f731866e8e63fed2bbba9a"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.403719 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" event={"ID":"5d734bc7-81d8-456d-8a09-5e138fa3ab1c","Type":"ContainerStarted","Data":"70f1a76b6903c5097cd959e9e720f7bf3e322390640407ad067039589679ac23"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.405407 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:19 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:19 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:19 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.405457 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.412562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" event={"ID":"687efab0-3f01-453e-b202-46d47422c46d","Type":"ContainerStarted","Data":"5096467ee701724841b021201ff3d02041d9bb2220422f3e55fe2bab4d2a1761"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.423981 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.428128 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.429148 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:19.929132824 +0000 UTC m=+225.561365802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.437029 4735 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mk2xg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.437128 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" podUID="687efab0-3f01-453e-b202-46d47422c46d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 
01:13:19.454038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" event={"ID":"ae1e8782-bee4-4a76-837f-69e3735abc86","Type":"ContainerStarted","Data":"8ab0214c3f1d47845f86ccef40733442ee497cfb6886f3c487a2e6336cef93ad"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.454091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" event={"ID":"ae1e8782-bee4-4a76-837f-69e3735abc86","Type":"ContainerStarted","Data":"5e1dd07bac349164ec3ef5040f5bfb6bcb27ad7e31ae16f70b369067f0470820"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.535486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" event={"ID":"db479d6c-1a3c-4de1-86be-ba6806eac784","Type":"ContainerStarted","Data":"541e214bb4d1477cef439d60ad7e73f1ff0a60434b9bf1b102a7de641394e2f3"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.536687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.538109 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.038086894 +0000 UTC m=+225.670319872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.556741 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jjfm7" podStartSLOduration=10.556717752 podStartE2EDuration="10.556717752s" podCreationTimestamp="2026-03-17 01:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.554564678 +0000 UTC m=+225.186797656" watchObservedRunningTime="2026-03-17 01:13:19.556717752 +0000 UTC m=+225.188950730" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.562746 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" event={"ID":"ecb29df1-b4b4-473a-8795-564a241ce156","Type":"ContainerStarted","Data":"1bdd7bec912283d63610e88d1ee0c784263088b6b53e6bc5e8fafc5c0e84e84b"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.589798 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" event={"ID":"96e8ca98-69e8-409a-acc1-85ee705dfede","Type":"ContainerStarted","Data":"abd3d77e7604eb5aba9e582470e1d188443792f47d55edc4f0bf4879cbb57445"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.590524 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.607896 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" event={"ID":"4299c484-7f23-433c-a4b9-d3daf740fa7c","Type":"ContainerStarted","Data":"8ebbc75cb10ed8d4e259342a3b0032a16bd3495119f8eedd9f5b8f630c0db55b"} Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.608607 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4rpxx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.608648 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.608716 4735 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kr886 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.608732 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" podUID="edc97b39-8677-476a-b336-1645f172c219" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.609384 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.609411 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.610300 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.646892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.647420 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.147404985 +0000 UTC m=+225.779637963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.658954 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rxdbc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.659473 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" podUID="4299c484-7f23-433c-a4b9-d3daf740fa7c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.751671 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.751875 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:20.251823363 +0000 UTC m=+225.884056361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.752679 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.753080 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.253069294 +0000 UTC m=+225.885302272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.853272 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.853698 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.353681177 +0000 UTC m=+225.985914155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.856692 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" podStartSLOduration=159.856669992 podStartE2EDuration="2m39.856669992s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.771464506 +0000 UTC m=+225.403697484" watchObservedRunningTime="2026-03-17 01:13:19.856669992 +0000 UTC m=+225.488902970" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.941243 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdv79" podStartSLOduration=158.941225081 podStartE2EDuration="2m38.941225081s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.859325638 +0000 UTC m=+225.491558616" watchObservedRunningTime="2026-03-17 01:13:19.941225081 +0000 UTC m=+225.573458059" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.956779 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: 
\"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:19 crc kubenswrapper[4735]: E0317 01:13:19.957329 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.457309525 +0000 UTC m=+226.089542503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.965044 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:13:19 crc kubenswrapper[4735]: I0317 01:13:19.981108 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dg6b7" podStartSLOduration=158.981091271 podStartE2EDuration="2m38.981091271s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:19.978486716 +0000 UTC m=+225.610719694" watchObservedRunningTime="2026-03-17 01:13:19.981091271 +0000 UTC m=+225.613324239" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.058850 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.059404 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.559370954 +0000 UTC m=+226.191604092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.080599 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rx6t" podStartSLOduration=159.080573655 podStartE2EDuration="2m39.080573655s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.032337505 +0000 UTC m=+225.664570483" watchObservedRunningTime="2026-03-17 01:13:20.080573655 +0000 UTC m=+225.712806633" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.108959 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" podStartSLOduration=159.108929726 podStartE2EDuration="2m39.108929726s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.083349505 +0000 UTC m=+225.715582483" watchObservedRunningTime="2026-03-17 01:13:20.108929726 +0000 UTC m=+225.741162704" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.110627 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pqxxs" podStartSLOduration=159.110620308 podStartE2EDuration="2m39.110620308s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.109637964 +0000 UTC m=+225.741870942" watchObservedRunningTime="2026-03-17 01:13:20.110620308 +0000 UTC m=+225.742853286" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.161361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.161797 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.661782461 +0000 UTC m=+226.294015439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.190406 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" podStartSLOduration=159.190383348 podStartE2EDuration="2m39.190383348s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.188842489 +0000 UTC m=+225.821075467" watchObservedRunningTime="2026-03-17 01:13:20.190383348 +0000 UTC m=+225.822616326" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.262053 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.262409 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.762387763 +0000 UTC m=+226.394620741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.285791 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" podStartSLOduration=160.285773819 podStartE2EDuration="2m40.285773819s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.264486796 +0000 UTC m=+225.896719774" watchObservedRunningTime="2026-03-17 01:13:20.285773819 +0000 UTC m=+225.918006797" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.363748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.364185 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.864168104 +0000 UTC m=+226.496401082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.388927 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:20 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:20 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:20 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.389022 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.464345 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.464676 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:20.964601813 +0000 UTC m=+226.596834791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.464776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.465130 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:20.965113855 +0000 UTC m=+226.597346833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.566597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.566935 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.066896047 +0000 UTC m=+226.699129025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.597693 4735 ???:1] "http: TLS handshake error from 192.168.126.11:47666: no serving certificate available for the kubelet" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.641539 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"19ef32525c9d5cf32abb925148d53f4db9e135a31ac9e5270a6a1a8e79ed3864"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.667171 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tn9pc" event={"ID":"dbd537a4-457f-47fc-9ed0-07044027b0d5","Type":"ContainerStarted","Data":"3bac1d391ad52c8c64f8f46d59c1baaf8a73c26d962f6a60a803fd813b37e602"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.668102 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.669293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.669707 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.169679524 +0000 UTC m=+226.801912502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.704903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" event={"ID":"ecb29df1-b4b4-473a-8795-564a241ce156","Type":"ContainerStarted","Data":"513791265fe0174173ff689877858acc76a0cd802e6c4eb81c7488279f5a8f2a"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.726981 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tn9pc" podStartSLOduration=11.726963871 podStartE2EDuration="11.726963871s" podCreationTimestamp="2026-03-17 01:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.700185699 +0000 UTC m=+226.332418677" watchObservedRunningTime="2026-03-17 01:13:20.726963871 +0000 UTC m=+226.359196849" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.728227 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-cwkhd" podStartSLOduration=159.728222011 podStartE2EDuration="2m39.728222011s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.288097328 +0000 UTC m=+225.920330306" watchObservedRunningTime="2026-03-17 01:13:20.728222011 +0000 UTC m=+226.360454989" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.738684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c5776bdab6093fcf22427545774c67ad05347ec609ef79ee1fec12cf60273bcd"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.755468 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kcqhw" podStartSLOduration=159.755443804 podStartE2EDuration="2m39.755443804s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.753464824 +0000 UTC m=+226.385697802" watchObservedRunningTime="2026-03-17 01:13:20.755443804 +0000 UTC m=+226.387676782" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.757661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" event={"ID":"748d17ea-e9d8-429c-8ed8-a94c3d6c23d4","Type":"ContainerStarted","Data":"585e0910e8d1726b2b46b5dd8b2ca4501bd1c3fcc2209e6f1a8980bb7ee51a16"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.770045 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:20 crc 
kubenswrapper[4735]: E0317 01:13:20.771788 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.271767244 +0000 UTC m=+226.904000222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.773143 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" event={"ID":"211ba440-450f-4fbe-aaf8-b540716338d1","Type":"ContainerStarted","Data":"ed4a56e66f06f1b3505faa62f077f35954fead873820f399bcc706cb98d7c1bf"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.773184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" event={"ID":"211ba440-450f-4fbe-aaf8-b540716338d1","Type":"ContainerStarted","Data":"35121513d65933d2453a19b62bfcc4a1d8ff5cb76ae6645289a97838742e5382"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.777584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" event={"ID":"83a10c7e-ff3c-41b6-adb7-14243953ed5a","Type":"ContainerStarted","Data":"8605e4c6cb80f345b25a59f45676cd2dc0f09a183255c921cea34fb3d2f4d9c3"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.788298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9fcd365766e7fb7fc3a31bf98c774a0d1c25b6ca21125dd2673385309cc0d91a"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.788958 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.798422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" event={"ID":"a039ba69-6339-4e2d-a131-2018bfcc356d","Type":"ContainerStarted","Data":"b63625e56abe0ea42d129921b9e23c387c6ea183631f384fccd881be1557d8bf"} Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.801879 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4rpxx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.801915 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.836220 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.836456 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kr886" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.873448 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.877794 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.377777091 +0000 UTC m=+227.010010069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.892621 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" podStartSLOduration=159.892599403 podStartE2EDuration="2m39.892599403s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.888454109 +0000 UTC m=+226.520687087" watchObservedRunningTime="2026-03-17 01:13:20.892599403 +0000 UTC m=+226.524832371" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.949636 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2wn4z" podStartSLOduration=160.949619883 
podStartE2EDuration="2m40.949619883s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:20.948349641 +0000 UTC m=+226.580582619" watchObservedRunningTime="2026-03-17 01:13:20.949619883 +0000 UTC m=+226.581852861" Mar 17 01:13:20 crc kubenswrapper[4735]: I0317 01:13:20.976237 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:20 crc kubenswrapper[4735]: E0317 01:13:20.978218 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.478200659 +0000 UTC m=+227.110433637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.040235 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.060436 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-sgbsk" podStartSLOduration=160.0604139 podStartE2EDuration="2m40.0604139s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:21.059267381 +0000 UTC m=+226.691500359" watchObservedRunningTime="2026-03-17 01:13:21.0604139 +0000 UTC m=+226.692646878" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.061053 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" podStartSLOduration=161.061048706 podStartE2EDuration="2m41.061048706s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:21.018250493 +0000 UTC m=+226.650483471" watchObservedRunningTime="2026-03-17 01:13:21.061048706 +0000 UTC m=+226.693281684" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.078373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.078711 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.578698458 +0000 UTC m=+227.210931426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.179672 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.181649 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.681594728 +0000 UTC m=+227.313827776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.296010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.296532 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.796518039 +0000 UTC m=+227.428751017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.346160 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.346194 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.368054 4735 patch_prober.go:28] interesting pod/console-f9d7485db-nh28b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.368119 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nh28b" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.399106 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:21 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:21 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:21 crc 
kubenswrapper[4735]: healthz check failed Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.399168 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.399483 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.399743 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:21.899723496 +0000 UTC m=+227.531956474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.467922 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jsnf"] Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.500680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.501708 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.001690373 +0000 UTC m=+227.633923351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.566945 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr"] Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.567186 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" podUID="f326c014-e2bb-4d5a-a0ac-580a61a041f0" containerName="route-controller-manager" containerID="cri-o://675847c0d98a131d9338742cc981e2d29a2ca77cd6f4526b8d802c6ba8eef8e2" gracePeriod=30 Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.601504 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.601916 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.101897824 +0000 UTC m=+227.734130812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.703261 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.703667 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.203649966 +0000 UTC m=+227.835882944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.799367 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rxdbc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.799474 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" podUID="4299c484-7f23-433c-a4b9-d3daf740fa7c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.804185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.804539 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 01:13:22.304521614 +0000 UTC m=+227.936754592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.817736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" event={"ID":"a8197b9a-a197-4e40-bebb-e8308ec1c094","Type":"ContainerStarted","Data":"7df48ecc267bbae77212b8d12a62615cebd7bb3e94971bfef012cfd82f3c24c2"} Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.826979 4735 generic.go:334] "Generic (PLEG): container finished" podID="f326c014-e2bb-4d5a-a0ac-580a61a041f0" containerID="675847c0d98a131d9338742cc981e2d29a2ca77cd6f4526b8d802c6ba8eef8e2" exitCode=0 Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.827165 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" event={"ID":"f326c014-e2bb-4d5a-a0ac-580a61a041f0","Type":"ContainerDied","Data":"675847c0d98a131d9338742cc981e2d29a2ca77cd6f4526b8d802c6ba8eef8e2"} Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.830111 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerName="controller-manager" containerID="cri-o://4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8" gracePeriod=30 Mar 17 01:13:21 crc kubenswrapper[4735]: I0317 01:13:21.906882 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:21 crc kubenswrapper[4735]: E0317 01:13:21.907685 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.407669861 +0000 UTC m=+228.039902839 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.008528 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.009022 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.508991601 +0000 UTC m=+228.141224579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.009177 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.009491 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.509479153 +0000 UTC m=+228.141712131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.110536 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.110805 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.610790253 +0000 UTC m=+228.243023221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.211487 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.211917 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.711896198 +0000 UTC m=+228.344129186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.285951 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.312736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.313063 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.813048344 +0000 UTC m=+228.445281322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.381776 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mxnnp"] Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.382557 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f326c014-e2bb-4d5a-a0ac-580a61a041f0" containerName="route-controller-manager" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.382572 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f326c014-e2bb-4d5a-a0ac-580a61a041f0" containerName="route-controller-manager" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.382706 
4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f326c014-e2bb-4d5a-a0ac-580a61a041f0" containerName="route-controller-manager" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.383605 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.402564 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:22 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:22 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:22 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.402654 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.402824 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.414055 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-client-ca\") pod \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.414130 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-config\") pod \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\" 
(UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.414172 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f326c014-e2bb-4d5a-a0ac-580a61a041f0-serving-cert\") pod \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.414394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cx2c\" (UniqueName: \"kubernetes.io/projected/f326c014-e2bb-4d5a-a0ac-580a61a041f0-kube-api-access-9cx2c\") pod \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\" (UID: \"f326c014-e2bb-4d5a-a0ac-580a61a041f0\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.414543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.414899 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:22.914886657 +0000 UTC m=+228.547119635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.415471 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-config" (OuterVolumeSpecName: "config") pod "f326c014-e2bb-4d5a-a0ac-580a61a041f0" (UID: "f326c014-e2bb-4d5a-a0ac-580a61a041f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.419506 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "f326c014-e2bb-4d5a-a0ac-580a61a041f0" (UID: "f326c014-e2bb-4d5a-a0ac-580a61a041f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.428123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f326c014-e2bb-4d5a-a0ac-580a61a041f0-kube-api-access-9cx2c" (OuterVolumeSpecName: "kube-api-access-9cx2c") pod "f326c014-e2bb-4d5a-a0ac-580a61a041f0" (UID: "f326c014-e2bb-4d5a-a0ac-580a61a041f0"). InnerVolumeSpecName "kube-api-access-9cx2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.457289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f326c014-e2bb-4d5a-a0ac-580a61a041f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f326c014-e2bb-4d5a-a0ac-580a61a041f0" (UID: "f326c014-e2bb-4d5a-a0ac-580a61a041f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.499206 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxnnp"] Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520388 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fdtd\" (UniqueName: \"kubernetes.io/projected/9f91af40-cf84-4b86-8aa5-fce087ec360d-kube-api-access-8fdtd\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-catalog-content\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520683 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-utilities\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520735 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cx2c\" (UniqueName: \"kubernetes.io/projected/f326c014-e2bb-4d5a-a0ac-580a61a041f0-kube-api-access-9cx2c\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520745 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520754 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f326c014-e2bb-4d5a-a0ac-580a61a041f0-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.520763 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f326c014-e2bb-4d5a-a0ac-580a61a041f0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.520839 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.020825093 +0000 UTC m=+228.653058071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.597398 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.597456 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.597645 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.597845 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.597889 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 
10.217.0.11:8080: connect: connection refused" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.598152 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.599405 4735 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-nr62z container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.599435 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" podUID="96e8ca98-69e8-409a-acc1-85ee705dfede" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.602455 4735 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-nr62z container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.602529 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" podUID="96e8ca98-69e8-409a-acc1-85ee705dfede" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 
01:13:22.622740 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fdtd\" (UniqueName: \"kubernetes.io/projected/9f91af40-cf84-4b86-8aa5-fce087ec360d-kube-api-access-8fdtd\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.622819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-catalog-content\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.622887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.622912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-utilities\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.624814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-utilities\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc 
kubenswrapper[4735]: I0317 01:13:22.625465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-catalog-content\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.625699 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.125680502 +0000 UTC m=+228.757913680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.721781 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fdtd\" (UniqueName: \"kubernetes.io/projected/9f91af40-cf84-4b86-8aa5-fce087ec360d-kube-api-access-8fdtd\") pod \"certified-operators-mxnnp\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.724052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:22 
crc kubenswrapper[4735]: E0317 01:13:22.724229 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.224216502 +0000 UTC m=+228.856449480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.724398 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.724671 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.224663303 +0000 UTC m=+228.856896281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.738375 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.788417 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.789924 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9pzj"] Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.790107 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerName="controller-manager" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.790122 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerName="controller-manager" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.790199 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerName="controller-manager" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.790795 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.828008 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rxdbc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.828070 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" podUID="4299c484-7f23-433c-a4b9-d3daf740fa7c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.828394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.828754 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.328735713 +0000 UTC m=+228.960968691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.842118 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw"] Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.842822 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.872239 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9pzj"] Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.875801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" event={"ID":"a8197b9a-a197-4e40-bebb-e8308ec1c094","Type":"ContainerStarted","Data":"46590f32981545cb9c495e564c5c1248e323e3d9a54409c1e8c3a0726d7f027f"} Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.888697 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw"] Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.893130 4735 generic.go:334] "Generic (PLEG): container finished" podID="9cb55cbc-4698-4a34-aa2d-bead443c0784" containerID="4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8" exitCode=0 Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.893246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" event={"ID":"9cb55cbc-4698-4a34-aa2d-bead443c0784","Type":"ContainerDied","Data":"4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8"} Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.893278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" event={"ID":"9cb55cbc-4698-4a34-aa2d-bead443c0784","Type":"ContainerDied","Data":"22e250f42397c59b47ad967340d7a30ee2e8d444b3ef1c9148e28a2f710ebc27"} Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.893297 4735 scope.go:117] "RemoveContainer" containerID="4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.893417 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jsnf" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.909872 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.916977 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr" event={"ID":"f326c014-e2bb-4d5a-a0ac-580a61a041f0","Type":"ContainerDied","Data":"a13d6b3021951b73ef13a57bf03fd9ce909c18c3d5732c236fdd715420564012"} Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.940917 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-proxy-ca-bundles\") pod \"9cb55cbc-4698-4a34-aa2d-bead443c0784\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.940976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb55cbc-4698-4a34-aa2d-bead443c0784-serving-cert\") pod \"9cb55cbc-4698-4a34-aa2d-bead443c0784\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941038 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-client-ca\") pod \"9cb55cbc-4698-4a34-aa2d-bead443c0784\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm7dp\" (UniqueName: \"kubernetes.io/projected/9cb55cbc-4698-4a34-aa2d-bead443c0784-kube-api-access-xm7dp\") pod \"9cb55cbc-4698-4a34-aa2d-bead443c0784\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941088 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-config\") pod \"9cb55cbc-4698-4a34-aa2d-bead443c0784\" (UID: \"9cb55cbc-4698-4a34-aa2d-bead443c0784\") " Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941290 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-client-ca\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941305 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0335b42-7be7-4b74-b722-0697f25b6252-serving-cert\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941324 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdcck\" (UniqueName: \"kubernetes.io/projected/e0335b42-7be7-4b74-b722-0697f25b6252-kube-api-access-kdcck\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " 
pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4vw\" (UniqueName: \"kubernetes.io/projected/9766c85a-73b3-42ba-90c1-c4c93493d138-kube-api-access-mm4vw\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941360 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-utilities\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-catalog-content\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.941393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-config\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.962096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-proxy-ca-bundles" 
(OuterVolumeSpecName: "proxy-ca-bundles") pod "9cb55cbc-4698-4a34-aa2d-bead443c0784" (UID: "9cb55cbc-4698-4a34-aa2d-bead443c0784"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: E0317 01:13:22.965125 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.465107751 +0000 UTC m=+229.097340719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.965458 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-client-ca" (OuterVolumeSpecName: "client-ca") pod "9cb55cbc-4698-4a34-aa2d-bead443c0784" (UID: "9cb55cbc-4698-4a34-aa2d-bead443c0784"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.990292 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-db76h"] Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.991037 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb55cbc-4698-4a34-aa2d-bead443c0784-kube-api-access-xm7dp" (OuterVolumeSpecName: "kube-api-access-xm7dp") pod "9cb55cbc-4698-4a34-aa2d-bead443c0784" (UID: "9cb55cbc-4698-4a34-aa2d-bead443c0784"). InnerVolumeSpecName "kube-api-access-xm7dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.997130 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-config" (OuterVolumeSpecName: "config") pod "9cb55cbc-4698-4a34-aa2d-bead443c0784" (UID: "9cb55cbc-4698-4a34-aa2d-bead443c0784"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:22 crc kubenswrapper[4735]: I0317 01:13:22.998497 4735 scope.go:117] "RemoveContainer" containerID="4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.005844 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8\": container with ID starting with 4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8 not found: ID does not exist" containerID="4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.005906 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8"} err="failed to get container status \"4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8\": rpc error: code = NotFound desc = could not find container \"4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8\": container with ID starting with 4c415ed9284f9eab6d48e6a5479cbfd0d6d3fa780790c22caf28c54a61b092f8 not found: ID does not exist" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.005937 4735 scope.go:117] "RemoveContainer" containerID="675847c0d98a131d9338742cc981e2d29a2ca77cd6f4526b8d802c6ba8eef8e2" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.013809 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.013847 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.015003 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.015570 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb55cbc-4698-4a34-aa2d-bead443c0784-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9cb55cbc-4698-4a34-aa2d-bead443c0784" (UID: "9cb55cbc-4698-4a34-aa2d-bead443c0784"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.027166 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.040586 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-db76h"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.058993 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-config\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059508 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-client-ca\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: 
\"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059528 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0335b42-7be7-4b74-b722-0697f25b6252-serving-cert\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdcck\" (UniqueName: \"kubernetes.io/projected/e0335b42-7be7-4b74-b722-0697f25b6252-kube-api-access-kdcck\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059589 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4vw\" (UniqueName: \"kubernetes.io/projected/9766c85a-73b3-42ba-90c1-c4c93493d138-kube-api-access-mm4vw\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059623 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-utilities\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.059644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-catalog-content\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.066613 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.067012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-catalog-content\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.067925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-config\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.068439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-client-ca\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.070173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-utilities\") pod \"certified-operators-s9pzj\" (UID: 
\"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.070221 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb55cbc-4698-4a34-aa2d-bead443c0784-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.070273 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.570253927 +0000 UTC m=+229.202486905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.070294 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.070312 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm7dp\" (UniqueName: \"kubernetes.io/projected/9cb55cbc-4698-4a34-aa2d-bead443c0784-kube-api-access-xm7dp\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.070322 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb55cbc-4698-4a34-aa2d-bead443c0784-config\") on node \"crc\" DevicePath \"\"" 
Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.078701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0335b42-7be7-4b74-b722-0697f25b6252-serving-cert\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.098184 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bnkg8"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.099170 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.100065 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.123980 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4vw\" (UniqueName: \"kubernetes.io/projected/9766c85a-73b3-42ba-90c1-c4c93493d138-kube-api-access-mm4vw\") pod \"certified-operators-s9pzj\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") " pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.128602 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdcck\" (UniqueName: \"kubernetes.io/projected/e0335b42-7be7-4b74-b722-0697f25b6252-kube-api-access-kdcck\") pod \"route-controller-manager-c9678b8df-fz5vw\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.137338 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bnkg8"] 
Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.169951 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.170224 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9pzj" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.171122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7cdt\" (UniqueName: \"kubernetes.io/projected/1a4643c9-428f-46de-b795-44c73e85d7f9-kube-api-access-g7cdt\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.171184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-catalog-content\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.171229 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-utilities\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.171276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: 
\"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.176413 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.676388588 +0000 UTC m=+229.308621566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.176624 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.183056 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.193260 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hwnhr"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275476 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-catalog-content\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-utilities\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-catalog-content\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275754 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-utilities\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7cdt\" (UniqueName: \"kubernetes.io/projected/1a4643c9-428f-46de-b795-44c73e85d7f9-kube-api-access-g7cdt\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.275828 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h67q9\" (UniqueName: \"kubernetes.io/projected/0d63aa4e-027e-4486-8b30-b7b583c57b3a-kube-api-access-h67q9\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.275936 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.775919494 +0000 UTC m=+229.408152472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.276295 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-catalog-content\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.277242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-utilities\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.289595 4735 ???:1] "http: TLS handshake error from 192.168.126.11:47670: no serving certificate available for the kubelet" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.337844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7cdt\" (UniqueName: \"kubernetes.io/projected/1a4643c9-428f-46de-b795-44c73e85d7f9-kube-api-access-g7cdt\") pod \"community-operators-db76h\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.364087 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.379529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-catalog-content\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.379937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.379968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-utilities\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.380026 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h67q9\" (UniqueName: \"kubernetes.io/projected/0d63aa4e-027e-4486-8b30-b7b583c57b3a-kube-api-access-h67q9\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.380689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-catalog-content\") pod 
\"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.381005 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:23.880995588 +0000 UTC m=+229.513228566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.381529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-utilities\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.387075 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.410148 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:23 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:23 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:23 crc 
kubenswrapper[4735]: healthz check failed Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.410207 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.448480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h67q9\" (UniqueName: \"kubernetes.io/projected/0d63aa4e-027e-4486-8b30-b7b583c57b3a-kube-api-access-h67q9\") pod \"community-operators-bnkg8\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") " pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.462901 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.474716 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jsnf"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.474771 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jsnf"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.477218 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bnkg8" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.500212 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.500415 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.000394281 +0000 UTC m=+229.632627259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.500543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.502169 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.002157066 +0000 UTC m=+229.634390044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.529235 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr62z" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.564804 4735 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.603503 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.604558 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.103597939 +0000 UTC m=+229.735830917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.604652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.605246 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.105225119 +0000 UTC m=+229.737458087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.706385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.706756 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.206730215 +0000 UTC m=+229.838963193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.807357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.807691 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.307677615 +0000 UTC m=+229.939910593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d9dqr" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.911981 4735 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-17T01:13:23.564829187Z","Handler":null,"Name":""} Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.913398 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:23 crc kubenswrapper[4735]: E0317 01:13:23.913916 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 01:13:24.413901468 +0000 UTC m=+230.046134446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.919746 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.920609 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.924208 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.924460 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.938796 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-585487db7-xr8mx"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.939759 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.951630 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.951913 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.952077 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.952204 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.952389 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.957374 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxnnp"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.958805 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.982049 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585487db7-xr8mx"] Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.982516 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.996434 4735 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock 
versions: 1.0.0 Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.996498 4735 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 17 01:13:23 crc kubenswrapper[4735]: I0317 01:13:23.997471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" event={"ID":"a8197b9a-a197-4e40-bebb-e8308ec1c094","Type":"ContainerStarted","Data":"01f0f14a3bba2e3a325a8926fe48f68ad537bad961d7218ccc41aa9352026aab"} Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.035594 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.042973 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5grcp" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.063722 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.101430 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.101499 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.140811 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-client-ca\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.140903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b55wn\" (UniqueName: \"kubernetes.io/projected/2189d27a-054a-4ce5-8548-ecc25f1c0b33-kube-api-access-b55wn\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.140955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2189d27a-054a-4ce5-8548-ecc25f1c0b33-serving-cert\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 
01:13:24.140986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc95b73-5e1d-45e9-9708-42d581674cc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.141024 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-proxy-ca-bundles\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.141067 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-config\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.141083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dc95b73-5e1d-45e9-9708-42d581674cc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: W0317 01:13:24.192775 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0335b42_7be7_4b74_b722_0697f25b6252.slice/crio-017f36f13035c501722f546ea96f2837a9566b541b99ca95fa7e4ca5690a1510 WatchSource:0}: Error finding container 
017f36f13035c501722f546ea96f2837a9566b541b99ca95fa7e4ca5690a1510: Status 404 returned error can't find the container with id 017f36f13035c501722f546ea96f2837a9566b541b99ca95fa7e4ca5690a1510 Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.194170 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vwwj9" podStartSLOduration=15.194148894 podStartE2EDuration="15.194148894s" podCreationTimestamp="2026-03-17 01:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:24.114452166 +0000 UTC m=+229.746685144" watchObservedRunningTime="2026-03-17 01:13:24.194148894 +0000 UTC m=+229.826381872" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.195453 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw"] Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248621 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-proxy-ca-bundles\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248687 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-config\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1dc95b73-5e1d-45e9-9708-42d581674cc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-client-ca\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248792 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b55wn\" (UniqueName: \"kubernetes.io/projected/2189d27a-054a-4ce5-8548-ecc25f1c0b33-kube-api-access-b55wn\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248894 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2189d27a-054a-4ce5-8548-ecc25f1c0b33-serving-cert\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.248930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc95b73-5e1d-45e9-9708-42d581674cc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.249045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc95b73-5e1d-45e9-9708-42d581674cc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.250650 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-proxy-ca-bundles\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.250735 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-config\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.251649 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-client-ca\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.263106 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2189d27a-054a-4ce5-8548-ecc25f1c0b33-serving-cert\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.303793 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b55wn\" (UniqueName: \"kubernetes.io/projected/2189d27a-054a-4ce5-8548-ecc25f1c0b33-kube-api-access-b55wn\") pod \"controller-manager-585487db7-xr8mx\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.310939 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dc95b73-5e1d-45e9-9708-42d581674cc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.323244 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9pzj"] Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.388052 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:24 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:24 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:24 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.388113 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.453456 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-d9dqr\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.516563 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bnkg8"] Mar 17 01:13:24 crc kubenswrapper[4735]: W0317 01:13:24.543839 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d63aa4e_027e_4486_8b30_b7b583c57b3a.slice/crio-992fcb6bdcebe3b48efe5b88cd4781a9d3ec92fbfe97fc3346b5f09aaf59f6a3 WatchSource:0}: Error finding container 992fcb6bdcebe3b48efe5b88cd4781a9d3ec92fbfe97fc3346b5f09aaf59f6a3: Status 404 returned error can't find the container with id 992fcb6bdcebe3b48efe5b88cd4781a9d3ec92fbfe97fc3346b5f09aaf59f6a3 Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.556565 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.560950 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.578696 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.653979 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d4h2p"] Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.656574 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.658569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-utilities\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.658728 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d8fr\" (UniqueName: \"kubernetes.io/projected/277daa3a-bd0c-46b3-915f-1050fbfa37ac-kube-api-access-7d8fr\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.658804 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-catalog-content\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.659240 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.668752 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-db76h"] Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.684786 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4h2p"] Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.700424 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.761780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d8fr\" (UniqueName: \"kubernetes.io/projected/277daa3a-bd0c-46b3-915f-1050fbfa37ac-kube-api-access-7d8fr\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.762159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-catalog-content\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.762246 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-utilities\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.763136 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-utilities\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.765696 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-catalog-content\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " 
pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.803668 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d8fr\" (UniqueName: \"kubernetes.io/projected/277daa3a-bd0c-46b3-915f-1050fbfa37ac-kube-api-access-7d8fr\") pod \"redhat-marketplace-d4h2p\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.819455 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.832216 4735 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qb49c container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]log ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]etcd ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/generic-apiserver-start-informers ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/max-in-flight-filter ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 17 01:13:24 crc kubenswrapper[4735]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 17 01:13:24 crc kubenswrapper[4735]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/project.openshift.io-projectcache ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-startinformers ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 17 01:13:24 crc kubenswrapper[4735]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 17 01:13:24 crc kubenswrapper[4735]: livez check failed Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.832271 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" podUID="211ba440-450f-4fbe-aaf8-b540716338d1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:24 crc kubenswrapper[4735]: I0317 01:13:24.981385 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.034192 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wd6n6"] Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.035593 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.062112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" event={"ID":"e0335b42-7be7-4b74-b722-0697f25b6252","Type":"ContainerStarted","Data":"3d6ddca9e97bf584bdb41585bffd9ce065c6cf145151aee229b5298b3cdb3f58"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.062166 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" event={"ID":"e0335b42-7be7-4b74-b722-0697f25b6252","Type":"ContainerStarted","Data":"017f36f13035c501722f546ea96f2837a9566b541b99ca95fa7e4ca5690a1510"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.062597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd6n6"] Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.063067 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.066131 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-db76h" event={"ID":"1a4643c9-428f-46de-b795-44c73e85d7f9","Type":"ContainerStarted","Data":"1057dbe8711cf87900956da3f54d1a037225b396d4ac3380ecdaa2be9366a11a"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.074392 4735 generic.go:334] "Generic (PLEG): container finished" podID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerID="5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78" exitCode=0 Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.088919 4735 generic.go:334] "Generic (PLEG): container finished" podID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerID="ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f" 
exitCode=0 Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.096961 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.097599 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb55cbc-4698-4a34-aa2d-bead443c0784" path="/var/lib/kubelet/pods/9cb55cbc-4698-4a34-aa2d-bead443c0784/volumes" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.109244 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f326c014-e2bb-4d5a-a0ac-580a61a041f0" path="/var/lib/kubelet/pods/f326c014-e2bb-4d5a-a0ac-580a61a041f0/volumes" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.109896 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerDied","Data":"5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.109932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerStarted","Data":"ad46de335b6e558843ac26e8849df144da9ea070c70df84c806e37f48d628498"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.109954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerDied","Data":"ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.109966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" 
event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerStarted","Data":"2a14b9e26e66a1a58fe3e6ebcb93c5d2c110c37e3c59ebaaaf62453f43c45caf"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.122094 4735 generic.go:334] "Generic (PLEG): container finished" podID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerID="539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9" exitCode=0 Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.122171 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerDied","Data":"539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.122226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerStarted","Data":"992fcb6bdcebe3b48efe5b88cd4781a9d3ec92fbfe97fc3346b5f09aaf59f6a3"} Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.184593 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.191150 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfcd\" (UniqueName: \"kubernetes.io/projected/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-kube-api-access-2xfcd\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.191196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-catalog-content\") pod \"redhat-marketplace-wd6n6\" (UID: 
\"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.191271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-utilities\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.292757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfcd\" (UniqueName: \"kubernetes.io/projected/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-kube-api-access-2xfcd\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.293349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-catalog-content\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.293560 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-utilities\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.299366 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-utilities\") pod \"redhat-marketplace-wd6n6\" (UID: 
\"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.300281 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-catalog-content\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.301197 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" podStartSLOduration=3.301158516 podStartE2EDuration="3.301158516s" podCreationTimestamp="2026-03-17 01:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:25.29210306 +0000 UTC m=+230.924336038" watchObservedRunningTime="2026-03-17 01:13:25.301158516 +0000 UTC m=+230.933391494" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.381245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfcd\" (UniqueName: \"kubernetes.io/projected/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-kube-api-access-2xfcd\") pod \"redhat-marketplace-wd6n6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.401451 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585487db7-xr8mx"] Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.411267 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 
01:13:25 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:25 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:25 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.411335 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.628252 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-46vft"] Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.629574 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.638126 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.656798 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46vft"] Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.679520 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.807109 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-catalog-content\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.807446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-utilities\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.807483 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdhh\" (UniqueName: \"kubernetes.io/projected/54e396e2-3911-4d16-9ff4-588b49a8a77c-kube-api-access-hwdhh\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.908970 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-catalog-content\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.909054 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-utilities\") pod \"redhat-operators-46vft\" (UID: 
\"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.909118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdhh\" (UniqueName: \"kubernetes.io/projected/54e396e2-3911-4d16-9ff4-588b49a8a77c-kube-api-access-hwdhh\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.910611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-catalog-content\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.911016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-utilities\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.944816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdhh\" (UniqueName: \"kubernetes.io/projected/54e396e2-3911-4d16-9ff4-588b49a8a77c-kube-api-access-hwdhh\") pod \"redhat-operators-46vft\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.980869 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:13:25 crc kubenswrapper[4735]: I0317 01:13:25.984253 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d9dqr"] Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.021482 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.026232 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4h2p"] Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.047353 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5tdwt"] Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.048355 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.071950 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tdwt"] Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.083055 4735 ???:1] "http: TLS handshake error from 192.168.126.11:47672: no serving certificate available for the kubelet" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.114915 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-catalog-content\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.114965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhk75\" (UniqueName: 
\"kubernetes.io/projected/9fb14281-df17-4753-ba53-e292dcb071fa-kube-api-access-nhk75\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.115030 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-utilities\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.233483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-utilities\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.233836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-catalog-content\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.233878 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhk75\" (UniqueName: \"kubernetes.io/projected/9fb14281-df17-4753-ba53-e292dcb071fa-kube-api-access-nhk75\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.236049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-catalog-content\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.236783 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-utilities\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.242215 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" event={"ID":"85e67779-627e-4c7b-8105-8bb93f10ec15","Type":"ContainerStarted","Data":"32c1b4483dfcc924cde4b3e3ca2301718aec8ec553f7170f3f0c4464272d9368"} Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.270706 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhk75\" (UniqueName: \"kubernetes.io/projected/9fb14281-df17-4753-ba53-e292dcb071fa-kube-api-access-nhk75\") pod \"redhat-operators-5tdwt\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.298640 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" event={"ID":"2189d27a-054a-4ce5-8548-ecc25f1c0b33","Type":"ContainerStarted","Data":"e1e5c5278b3bd821525ee8ef2659a6e7b5fa5f02eed7b5e0abe452f67902cc80"} Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.298693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" event={"ID":"2189d27a-054a-4ce5-8548-ecc25f1c0b33","Type":"ContainerStarted","Data":"5906cdb2b1a62366590457acb4857ff50003c18a8a2408ab8f9dd364e5c94921"} Mar 17 
01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.299085 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.310805 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd6n6"] Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.329662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1dc95b73-5e1d-45e9-9708-42d581674cc2","Type":"ContainerStarted","Data":"a6b9c6b5bbdd6e82c979a1790fa5ab5c47b85890a41ff8e43f8b9a656a77409c"} Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.351133 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerID="ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51" exitCode=0 Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.351259 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-db76h" event={"ID":"1a4643c9-428f-46de-b795-44c73e85d7f9","Type":"ContainerDied","Data":"ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51"} Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.378256 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.383136 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" podStartSLOduration=4.383117931 podStartE2EDuration="4.383117931s" podCreationTimestamp="2026-03-17 01:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:26.344133714 +0000 UTC m=+231.976366692" 
watchObservedRunningTime="2026-03-17 01:13:26.383117931 +0000 UTC m=+232.015350899" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.390660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerStarted","Data":"ef91125e4d65a5ed53caa166d91c694e045d2c84a1cffd70ac531a9437490336"} Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.394322 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:26 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:26 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:26 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.394399 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.433417 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.820419 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46vft"] Mar 17 01:13:26 crc kubenswrapper[4735]: I0317 01:13:26.967738 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tdwt"] Mar 17 01:13:27 crc kubenswrapper[4735]: E0317 01:13:27.096722 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod1dc95b73_5e1d_45e9_9708_42d581674cc2.slice/crio-b727a6e23f4596848341abbfbbd1ba88a2151040be2e9824e843eaae3451c298.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.397061 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:27 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:27 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:27 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.397694 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.449059 4735 generic.go:334] "Generic (PLEG): container finished" podID="9fb14281-df17-4753-ba53-e292dcb071fa" containerID="f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6" exitCode=0 Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.449688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerDied","Data":"f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.449853 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerStarted","Data":"16564cd19dc9fea5dd6eba7f0246720e7f76946946ba876d74348dfd05b5acb5"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.473798 4735 generic.go:334] "Generic (PLEG): container finished" podID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerID="bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18" exitCode=0 Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.473898 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerDied","Data":"bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.475337 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerStarted","Data":"842f025fb6c818013c1fcbffedfb5458c360fcb92cdd6a705fa061eecd375a8a"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.487768 4735 generic.go:334] "Generic (PLEG): container finished" podID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerID="76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3" exitCode=0 Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.487885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerDied","Data":"76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3"} Mar 17 
01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.505265 4735 generic.go:334] "Generic (PLEG): container finished" podID="a3fe26de-7115-40ae-a335-8feafa74fb68" containerID="aa799d9c8cf4e6466b4829b66470e10bc4def1e5479adcf9f4abc707b6f2c647" exitCode=0 Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.505346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" event={"ID":"a3fe26de-7115-40ae-a335-8feafa74fb68","Type":"ContainerDied","Data":"aa799d9c8cf4e6466b4829b66470e10bc4def1e5479adcf9f4abc707b6f2c647"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.507981 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" event={"ID":"85e67779-627e-4c7b-8105-8bb93f10ec15","Type":"ContainerStarted","Data":"ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.508585 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.543232 4735 generic.go:334] "Generic (PLEG): container finished" podID="1dc95b73-5e1d-45e9-9708-42d581674cc2" containerID="b727a6e23f4596848341abbfbbd1ba88a2151040be2e9824e843eaae3451c298" exitCode=0 Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.543339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1dc95b73-5e1d-45e9-9708-42d581674cc2","Type":"ContainerDied","Data":"b727a6e23f4596848341abbfbbd1ba88a2151040be2e9824e843eaae3451c298"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.550591 4735 generic.go:334] "Generic (PLEG): container finished" podID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerID="a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753" exitCode=0 Mar 17 01:13:27 crc 
kubenswrapper[4735]: I0317 01:13:27.551649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerDied","Data":"a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.551672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerStarted","Data":"505f2b91f5294f96003a74321b722add82fdbc4a6368f5da9b2282f45e045e72"} Mar 17 01:13:27 crc kubenswrapper[4735]: I0317 01:13:27.564810 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" podStartSLOduration=166.564783516 podStartE2EDuration="2m46.564783516s" podCreationTimestamp="2026-03-17 01:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:13:27.554797746 +0000 UTC m=+233.187030724" watchObservedRunningTime="2026-03-17 01:13:27.564783516 +0000 UTC m=+233.197016494" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.010145 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.021035 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qb49c" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.251756 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tn9pc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.306091 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 
01:13:28.306765 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.329483 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.329795 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.341321 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.399053 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:28 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:28 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:28 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.399449 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.471537 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.471583 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.507706 4735 ???:1] "http: TLS handshake error from 192.168.126.11:60848: no serving certificate available for the kubelet" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.573163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.573203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.579047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.607363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:28 crc kubenswrapper[4735]: I0317 01:13:28.653641 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.042590 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.065773 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.199572 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fe26de-7115-40ae-a335-8feafa74fb68-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3fe26de-7115-40ae-a335-8feafa74fb68" (UID: "a3fe26de-7115-40ae-a335-8feafa74fb68"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.199695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3fe26de-7115-40ae-a335-8feafa74fb68-config-volume\") pod \"a3fe26de-7115-40ae-a335-8feafa74fb68\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.199804 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3fe26de-7115-40ae-a335-8feafa74fb68-secret-volume\") pod \"a3fe26de-7115-40ae-a335-8feafa74fb68\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.199842 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dc95b73-5e1d-45e9-9708-42d581674cc2-kube-api-access\") pod \"1dc95b73-5e1d-45e9-9708-42d581674cc2\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.201101 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whpmn\" (UniqueName: \"kubernetes.io/projected/a3fe26de-7115-40ae-a335-8feafa74fb68-kube-api-access-whpmn\") pod \"a3fe26de-7115-40ae-a335-8feafa74fb68\" (UID: \"a3fe26de-7115-40ae-a335-8feafa74fb68\") " Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.201270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc95b73-5e1d-45e9-9708-42d581674cc2-kubelet-dir\") pod \"1dc95b73-5e1d-45e9-9708-42d581674cc2\" (UID: \"1dc95b73-5e1d-45e9-9708-42d581674cc2\") " Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.201839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/1dc95b73-5e1d-45e9-9708-42d581674cc2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1dc95b73-5e1d-45e9-9708-42d581674cc2" (UID: "1dc95b73-5e1d-45e9-9708-42d581674cc2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.202509 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc95b73-5e1d-45e9-9708-42d581674cc2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.202533 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3fe26de-7115-40ae-a335-8feafa74fb68-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.204967 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc95b73-5e1d-45e9-9708-42d581674cc2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1dc95b73-5e1d-45e9-9708-42d581674cc2" (UID: "1dc95b73-5e1d-45e9-9708-42d581674cc2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.205283 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fe26de-7115-40ae-a335-8feafa74fb68-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3fe26de-7115-40ae-a335-8feafa74fb68" (UID: "a3fe26de-7115-40ae-a335-8feafa74fb68"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.205765 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fe26de-7115-40ae-a335-8feafa74fb68-kube-api-access-whpmn" (OuterVolumeSpecName: "kube-api-access-whpmn") pod "a3fe26de-7115-40ae-a335-8feafa74fb68" (UID: "a3fe26de-7115-40ae-a335-8feafa74fb68"). InnerVolumeSpecName "kube-api-access-whpmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.303983 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whpmn\" (UniqueName: \"kubernetes.io/projected/a3fe26de-7115-40ae-a335-8feafa74fb68-kube-api-access-whpmn\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.304011 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3fe26de-7115-40ae-a335-8feafa74fb68-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.304022 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dc95b73-5e1d-45e9-9708-42d581674cc2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.391074 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:29 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:29 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:29 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.391126 4735 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.591223 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.595081 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78" event={"ID":"a3fe26de-7115-40ae-a335-8feafa74fb68","Type":"ContainerDied","Data":"addf2143d7bac42ba40f2254509ace061a1dde2cae910f68919d96dbc802bc87"} Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.595131 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="addf2143d7bac42ba40f2254509ace061a1dde2cae910f68919d96dbc802bc87" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.602031 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.602308 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1dc95b73-5e1d-45e9-9708-42d581674cc2","Type":"ContainerDied","Data":"a6b9c6b5bbdd6e82c979a1790fa5ab5c47b85890a41ff8e43f8b9a656a77409c"} Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.602386 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6b9c6b5bbdd6e82c979a1790fa5ab5c47b85890a41ff8e43f8b9a656a77409c" Mar 17 01:13:29 crc kubenswrapper[4735]: I0317 01:13:29.662942 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 01:13:30 crc kubenswrapper[4735]: I0317 01:13:30.390180 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:30 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:30 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:30 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:30 crc kubenswrapper[4735]: I0317 01:13:30.390235 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:30 crc kubenswrapper[4735]: I0317 01:13:30.642115 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bd10bbf1-2eff-4582-ba53-336cb1470f8f","Type":"ContainerStarted","Data":"47daadba96d1e6a81725f4e01199c46680393a245cb3c8cffd07236f81abb931"} Mar 17 01:13:31 crc kubenswrapper[4735]: 
I0317 01:13:31.347947 4735 patch_prober.go:28] interesting pod/console-f9d7485db-nh28b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 17 01:13:31 crc kubenswrapper[4735]: I0317 01:13:31.348350 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nh28b" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 17 01:13:31 crc kubenswrapper[4735]: I0317 01:13:31.388964 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:31 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:31 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:31 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:31 crc kubenswrapper[4735]: I0317 01:13:31.389012 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:31 crc kubenswrapper[4735]: I0317 01:13:31.688494 4735 generic.go:334] "Generic (PLEG): container finished" podID="bd10bbf1-2eff-4582-ba53-336cb1470f8f" containerID="ba93b545367afa69d49781adeb7dc69a9f04c320336787dd547cd08a15dd8090" exitCode=0 Mar 17 01:13:31 crc kubenswrapper[4735]: I0317 01:13:31.688543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"bd10bbf1-2eff-4582-ba53-336cb1470f8f","Type":"ContainerDied","Data":"ba93b545367afa69d49781adeb7dc69a9f04c320336787dd547cd08a15dd8090"} Mar 17 01:13:32 crc kubenswrapper[4735]: I0317 01:13:32.389515 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:32 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:32 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:32 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:32 crc kubenswrapper[4735]: I0317 01:13:32.389587 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:32 crc kubenswrapper[4735]: I0317 01:13:32.598176 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 01:13:32 crc kubenswrapper[4735]: I0317 01:13:32.598764 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 01:13:32 crc kubenswrapper[4735]: I0317 01:13:32.598659 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-lsrzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" 
start-of-body= Mar 17 01:13:32 crc kubenswrapper[4735]: I0317 01:13:32.599283 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lsrzv" podUID="60ccc3db-c677-47ff-8113-5eb103809a4a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.126071 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.313393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kube-api-access\") pod \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.313692 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kubelet-dir\") pod \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\" (UID: \"bd10bbf1-2eff-4582-ba53-336cb1470f8f\") " Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.313923 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd10bbf1-2eff-4582-ba53-336cb1470f8f" (UID: "bd10bbf1-2eff-4582-ba53-336cb1470f8f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.314053 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.320485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd10bbf1-2eff-4582-ba53-336cb1470f8f" (UID: "bd10bbf1-2eff-4582-ba53-336cb1470f8f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.388508 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:33 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:33 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:33 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.388600 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.414690 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd10bbf1-2eff-4582-ba53-336cb1470f8f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.804303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bd10bbf1-2eff-4582-ba53-336cb1470f8f","Type":"ContainerDied","Data":"47daadba96d1e6a81725f4e01199c46680393a245cb3c8cffd07236f81abb931"} Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.804606 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47daadba96d1e6a81725f4e01199c46680393a245cb3c8cffd07236f81abb931" Mar 17 01:13:33 crc kubenswrapper[4735]: I0317 01:13:33.804682 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 01:13:34 crc kubenswrapper[4735]: I0317 01:13:34.130024 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:34 crc kubenswrapper[4735]: I0317 01:13:34.151386 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a72fe2c-32fb-4360-882b-44debb825c9e-metrics-certs\") pod \"network-metrics-daemon-dkwf5\" (UID: \"3a72fe2c-32fb-4360-882b-44debb825c9e\") " pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:34 crc kubenswrapper[4735]: I0317 01:13:34.395277 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:34 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:34 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:34 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:34 crc kubenswrapper[4735]: I0317 01:13:34.395351 4735 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:34 crc kubenswrapper[4735]: I0317 01:13:34.397641 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkwf5" Mar 17 01:13:35 crc kubenswrapper[4735]: I0317 01:13:35.119489 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dkwf5"] Mar 17 01:13:35 crc kubenswrapper[4735]: I0317 01:13:35.389992 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 01:13:35 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Mar 17 01:13:35 crc kubenswrapper[4735]: [+]process-running ok Mar 17 01:13:35 crc kubenswrapper[4735]: healthz check failed Mar 17 01:13:35 crc kubenswrapper[4735]: I0317 01:13:35.390314 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 01:13:35 crc kubenswrapper[4735]: I0317 01:13:35.898182 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" event={"ID":"3a72fe2c-32fb-4360-882b-44debb825c9e","Type":"ContainerStarted","Data":"ae1434c87df15883c63851343e99c5efca1d924d5590383864d0b31904ad23ff"} Mar 17 01:13:36 crc kubenswrapper[4735]: I0317 01:13:36.397794 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:36 crc kubenswrapper[4735]: I0317 01:13:36.404820 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-s2dvf" Mar 17 01:13:39 crc kubenswrapper[4735]: I0317 01:13:39.529487 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585487db7-xr8mx"] Mar 17 01:13:39 crc kubenswrapper[4735]: I0317 01:13:39.530087 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" podUID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" containerName="controller-manager" containerID="cri-o://e1e5c5278b3bd821525ee8ef2659a6e7b5fa5f02eed7b5e0abe452f67902cc80" gracePeriod=30 Mar 17 01:13:39 crc kubenswrapper[4735]: I0317 01:13:39.558182 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw"] Mar 17 01:13:39 crc kubenswrapper[4735]: I0317 01:13:39.558414 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" podUID="e0335b42-7be7-4b74-b722-0697f25b6252" containerName="route-controller-manager" containerID="cri-o://3d6ddca9e97bf584bdb41585bffd9ce065c6cf145151aee229b5298b3cdb3f58" gracePeriod=30 Mar 17 01:13:40 crc kubenswrapper[4735]: I0317 01:13:40.976006 4735 generic.go:334] "Generic (PLEG): container finished" podID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" containerID="e1e5c5278b3bd821525ee8ef2659a6e7b5fa5f02eed7b5e0abe452f67902cc80" exitCode=0 Mar 17 01:13:40 crc kubenswrapper[4735]: I0317 01:13:40.976502 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" event={"ID":"2189d27a-054a-4ce5-8548-ecc25f1c0b33","Type":"ContainerDied","Data":"e1e5c5278b3bd821525ee8ef2659a6e7b5fa5f02eed7b5e0abe452f67902cc80"} Mar 17 01:13:40 crc kubenswrapper[4735]: I0317 01:13:40.982067 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="e0335b42-7be7-4b74-b722-0697f25b6252" containerID="3d6ddca9e97bf584bdb41585bffd9ce065c6cf145151aee229b5298b3cdb3f58" exitCode=0 Mar 17 01:13:40 crc kubenswrapper[4735]: I0317 01:13:40.982092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" event={"ID":"e0335b42-7be7-4b74-b722-0697f25b6252","Type":"ContainerDied","Data":"3d6ddca9e97bf584bdb41585bffd9ce065c6cf145151aee229b5298b3cdb3f58"} Mar 17 01:13:41 crc kubenswrapper[4735]: I0317 01:13:41.365406 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:41 crc kubenswrapper[4735]: I0317 01:13:41.369892 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:13:42 crc kubenswrapper[4735]: I0317 01:13:42.606368 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:13:42 crc kubenswrapper[4735]: I0317 01:13:42.607114 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:13:42 crc kubenswrapper[4735]: I0317 01:13:42.609523 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lsrzv" Mar 17 01:13:43 crc kubenswrapper[4735]: I0317 01:13:43.186699 4735 patch_prober.go:28] interesting pod/route-controller-manager-c9678b8df-fz5vw 
container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 17 01:13:43 crc kubenswrapper[4735]: I0317 01:13:43.186760 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" podUID="e0335b42-7be7-4b74-b722-0697f25b6252" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 17 01:13:44 crc kubenswrapper[4735]: I0317 01:13:44.580111 4735 patch_prober.go:28] interesting pod/controller-manager-585487db7-xr8mx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 17 01:13:44 crc kubenswrapper[4735]: I0317 01:13:44.580192 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" podUID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 17 01:13:44 crc kubenswrapper[4735]: I0317 01:13:44.706747 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.019709 4735 ???:1] "http: TLS handshake error from 192.168.126.11:34096: no serving certificate available for the kubelet" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.814658 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.849944 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z"] Mar 17 01:13:49 crc kubenswrapper[4735]: E0317 01:13:49.850145 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc95b73-5e1d-45e9-9708-42d581674cc2" containerName="pruner" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850156 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc95b73-5e1d-45e9-9708-42d581674cc2" containerName="pruner" Mar 17 01:13:49 crc kubenswrapper[4735]: E0317 01:13:49.850173 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0335b42-7be7-4b74-b722-0697f25b6252" containerName="route-controller-manager" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850179 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0335b42-7be7-4b74-b722-0697f25b6252" containerName="route-controller-manager" Mar 17 01:13:49 crc kubenswrapper[4735]: E0317 01:13:49.850188 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd10bbf1-2eff-4582-ba53-336cb1470f8f" containerName="pruner" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850194 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd10bbf1-2eff-4582-ba53-336cb1470f8f" containerName="pruner" Mar 17 01:13:49 crc kubenswrapper[4735]: E0317 01:13:49.850202 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fe26de-7115-40ae-a335-8feafa74fb68" containerName="collect-profiles" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850208 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fe26de-7115-40ae-a335-8feafa74fb68" containerName="collect-profiles" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850317 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1dc95b73-5e1d-45e9-9708-42d581674cc2" containerName="pruner" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850325 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fe26de-7115-40ae-a335-8feafa74fb68" containerName="collect-profiles" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850333 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd10bbf1-2eff-4582-ba53-336cb1470f8f" containerName="pruner" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850344 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0335b42-7be7-4b74-b722-0697f25b6252" containerName="route-controller-manager" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.850707 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.872479 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z"] Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.916109 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0335b42-7be7-4b74-b722-0697f25b6252-serving-cert\") pod \"e0335b42-7be7-4b74-b722-0697f25b6252\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.916186 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-client-ca\") pod \"e0335b42-7be7-4b74-b722-0697f25b6252\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.916217 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdcck\" (UniqueName: 
\"kubernetes.io/projected/e0335b42-7be7-4b74-b722-0697f25b6252-kube-api-access-kdcck\") pod \"e0335b42-7be7-4b74-b722-0697f25b6252\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.916261 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-config\") pod \"e0335b42-7be7-4b74-b722-0697f25b6252\" (UID: \"e0335b42-7be7-4b74-b722-0697f25b6252\") " Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.917122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0335b42-7be7-4b74-b722-0697f25b6252" (UID: "e0335b42-7be7-4b74-b722-0697f25b6252"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.917177 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-config" (OuterVolumeSpecName: "config") pod "e0335b42-7be7-4b74-b722-0697f25b6252" (UID: "e0335b42-7be7-4b74-b722-0697f25b6252"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.925009 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0335b42-7be7-4b74-b722-0697f25b6252-kube-api-access-kdcck" (OuterVolumeSpecName: "kube-api-access-kdcck") pod "e0335b42-7be7-4b74-b722-0697f25b6252" (UID: "e0335b42-7be7-4b74-b722-0697f25b6252"). InnerVolumeSpecName "kube-api-access-kdcck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4735]: I0317 01:13:49.943162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0335b42-7be7-4b74-b722-0697f25b6252-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0335b42-7be7-4b74-b722-0697f25b6252" (UID: "e0335b42-7be7-4b74-b722-0697f25b6252"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.017829 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-client-ca\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.017907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a5c137-93aa-4aa0-9801-b235f221c58e-serving-cert\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.018385 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-config\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.018546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mkx46\" (UniqueName: \"kubernetes.io/projected/17a5c137-93aa-4aa0-9801-b235f221c58e-kube-api-access-mkx46\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.018802 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0335b42-7be7-4b74-b722-0697f25b6252-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.018835 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.018911 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdcck\" (UniqueName: \"kubernetes.io/projected/e0335b42-7be7-4b74-b722-0697f25b6252-kube-api-access-kdcck\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.018931 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0335b42-7be7-4b74-b722-0697f25b6252-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.101012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" event={"ID":"3a72fe2c-32fb-4360-882b-44debb825c9e","Type":"ContainerStarted","Data":"49f11b8495c6a9afd0f00bfa26081cb284deef17f0001cd0819329d4e17ec0cf"} Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.103984 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" 
event={"ID":"e0335b42-7be7-4b74-b722-0697f25b6252","Type":"ContainerDied","Data":"017f36f13035c501722f546ea96f2837a9566b541b99ca95fa7e4ca5690a1510"} Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.104037 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.104041 4735 scope.go:117] "RemoveContainer" containerID="3d6ddca9e97bf584bdb41585bffd9ce065c6cf145151aee229b5298b3cdb3f58" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.119940 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-client-ca\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.119992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a5c137-93aa-4aa0-9801-b235f221c58e-serving-cert\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.120026 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-config\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.120559 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mkx46\" (UniqueName: \"kubernetes.io/projected/17a5c137-93aa-4aa0-9801-b235f221c58e-kube-api-access-mkx46\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.121364 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-client-ca\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.121578 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-config\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.131356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a5c137-93aa-4aa0-9801-b235f221c58e-serving-cert\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.134060 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw"] Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.139245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkx46\" (UniqueName: 
\"kubernetes.io/projected/17a5c137-93aa-4aa0-9801-b235f221c58e-kube-api-access-mkx46\") pod \"route-controller-manager-6f5ff97597-wh55z\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.139505 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c9678b8df-fz5vw"] Mar 17 01:13:50 crc kubenswrapper[4735]: I0317 01:13:50.169583 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:13:51 crc kubenswrapper[4735]: I0317 01:13:51.083015 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0335b42-7be7-4b74-b722-0697f25b6252" path="/var/lib/kubelet/pods/e0335b42-7be7-4b74-b722-0697f25b6252/volumes" Mar 17 01:13:53 crc kubenswrapper[4735]: I0317 01:13:53.185255 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" Mar 17 01:13:54 crc kubenswrapper[4735]: E0317 01:13:54.286319 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 17 01:13:54 crc kubenswrapper[4735]: E0317 01:13:54.286914 4735 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 01:13:54 crc kubenswrapper[4735]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 17 01:13:54 crc kubenswrapper[4735]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hprwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29561832-p6vx6_openshift-infra(c4bd5744-869c-4763-af43-3ffcce4d549f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 17 01:13:54 crc kubenswrapper[4735]: > logger="UnhandledError" Mar 17 01:13:54 crc kubenswrapper[4735]: E0317 01:13:54.288061 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" podUID="c4bd5744-869c-4763-af43-3ffcce4d549f" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.312791 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.346596 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69d6db4b49-wmfzl"] Mar 17 01:13:54 crc kubenswrapper[4735]: E0317 01:13:54.346844 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" containerName="controller-manager" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.346872 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" containerName="controller-manager" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.346980 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" containerName="controller-manager" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.347356 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.354268 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69d6db4b49-wmfzl"] Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.405638 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b55wn\" (UniqueName: \"kubernetes.io/projected/2189d27a-054a-4ce5-8548-ecc25f1c0b33-kube-api-access-b55wn\") pod \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.406028 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-client-ca\") pod \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.406258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-proxy-ca-bundles\") pod \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.406288 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-config\") pod \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\" (UID: \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.406565 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2189d27a-054a-4ce5-8548-ecc25f1c0b33-serving-cert\") pod \"2189d27a-054a-4ce5-8548-ecc25f1c0b33\" (UID: 
\"2189d27a-054a-4ce5-8548-ecc25f1c0b33\") " Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.407437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2189d27a-054a-4ce5-8548-ecc25f1c0b33" (UID: "2189d27a-054a-4ce5-8548-ecc25f1c0b33"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.407619 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-client-ca" (OuterVolumeSpecName: "client-ca") pod "2189d27a-054a-4ce5-8548-ecc25f1c0b33" (UID: "2189d27a-054a-4ce5-8548-ecc25f1c0b33"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.407711 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-config" (OuterVolumeSpecName: "config") pod "2189d27a-054a-4ce5-8548-ecc25f1c0b33" (UID: "2189d27a-054a-4ce5-8548-ecc25f1c0b33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-proxy-ca-bundles\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff174d1-76dd-4061-a15b-5e50db28a0be-serving-cert\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-config\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-client-ca\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vml\" (UniqueName: 
\"kubernetes.io/projected/7ff174d1-76dd-4061-a15b-5e50db28a0be-kube-api-access-r6vml\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408199 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408210 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.408218 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2189d27a-054a-4ce5-8548-ecc25f1c0b33-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.421627 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2189d27a-054a-4ce5-8548-ecc25f1c0b33-kube-api-access-b55wn" (OuterVolumeSpecName: "kube-api-access-b55wn") pod "2189d27a-054a-4ce5-8548-ecc25f1c0b33" (UID: "2189d27a-054a-4ce5-8548-ecc25f1c0b33"). InnerVolumeSpecName "kube-api-access-b55wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.426001 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2189d27a-054a-4ce5-8548-ecc25f1c0b33-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2189d27a-054a-4ce5-8548-ecc25f1c0b33" (UID: "2189d27a-054a-4ce5-8548-ecc25f1c0b33"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.509533 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-proxy-ca-bundles\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.509909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff174d1-76dd-4061-a15b-5e50db28a0be-serving-cert\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.509944 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-config\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.509963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-client-ca\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.509995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vml\" (UniqueName: \"kubernetes.io/projected/7ff174d1-76dd-4061-a15b-5e50db28a0be-kube-api-access-r6vml\") pod 
\"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.510034 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2189d27a-054a-4ce5-8548-ecc25f1c0b33-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.510047 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b55wn\" (UniqueName: \"kubernetes.io/projected/2189d27a-054a-4ce5-8548-ecc25f1c0b33-kube-api-access-b55wn\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.510952 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-client-ca\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.511030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-proxy-ca-bundles\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.511424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-config\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.517000 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff174d1-76dd-4061-a15b-5e50db28a0be-serving-cert\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.524504 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vml\" (UniqueName: \"kubernetes.io/projected/7ff174d1-76dd-4061-a15b-5e50db28a0be-kube-api-access-r6vml\") pod \"controller-manager-69d6db4b49-wmfzl\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.698639 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:13:54 crc kubenswrapper[4735]: I0317 01:13:54.972392 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zgxs6"] Mar 17 01:13:55 crc kubenswrapper[4735]: I0317 01:13:55.157823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" event={"ID":"2189d27a-054a-4ce5-8548-ecc25f1c0b33","Type":"ContainerDied","Data":"5906cdb2b1a62366590457acb4857ff50003c18a8a2408ab8f9dd364e5c94921"} Mar 17 01:13:55 crc kubenswrapper[4735]: I0317 01:13:55.157890 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585487db7-xr8mx" Mar 17 01:13:55 crc kubenswrapper[4735]: E0317 01:13:55.159720 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" podUID="c4bd5744-869c-4763-af43-3ffcce4d549f" Mar 17 01:13:55 crc kubenswrapper[4735]: I0317 01:13:55.191511 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585487db7-xr8mx"] Mar 17 01:13:55 crc kubenswrapper[4735]: I0317 01:13:55.233355 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-585487db7-xr8mx"] Mar 17 01:13:56 crc kubenswrapper[4735]: I0317 01:13:56.689058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 01:13:57 crc kubenswrapper[4735]: I0317 01:13:57.081159 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2189d27a-054a-4ce5-8548-ecc25f1c0b33" path="/var/lib/kubelet/pods/2189d27a-054a-4ce5-8548-ecc25f1c0b33/volumes" Mar 17 01:13:58 crc kubenswrapper[4735]: E0317 01:13:58.779095 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 17 01:13:58 crc kubenswrapper[4735]: E0317 01:13:58.779326 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fdtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mxnnp_openshift-marketplace(9f91af40-cf84-4b86-8aa5-fce087ec360d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:13:58 crc kubenswrapper[4735]: E0317 01:13:58.780517 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mxnnp" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" Mar 17 01:13:58 crc 
kubenswrapper[4735]: I0317 01:13:58.836121 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.837186 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.841694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.841911 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.860657 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.890053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59343d58-3315-46bf-becf-2f99a19e0116-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.890143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59343d58-3315-46bf-becf-2f99a19e0116-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.993808 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59343d58-3315-46bf-becf-2f99a19e0116-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"59343d58-3315-46bf-becf-2f99a19e0116\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.993930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59343d58-3315-46bf-becf-2f99a19e0116-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:58 crc kubenswrapper[4735]: I0317 01:13:58.994036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59343d58-3315-46bf-becf-2f99a19e0116-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:59 crc kubenswrapper[4735]: I0317 01:13:59.018816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59343d58-3315-46bf-becf-2f99a19e0116-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:59 crc kubenswrapper[4735]: I0317 01:13:59.194444 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:13:59 crc kubenswrapper[4735]: I0317 01:13:59.512245 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69d6db4b49-wmfzl"] Mar 17 01:13:59 crc kubenswrapper[4735]: I0317 01:13:59.612728 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z"] Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.128480 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561834-t7858"] Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.129235 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.132319 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.136486 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-t7858"] Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.215252 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxn9\" (UniqueName: \"kubernetes.io/projected/d726aceb-cffe-4667-9136-795a1442e125-kube-api-access-rtxn9\") pod \"auto-csr-approver-29561834-t7858\" (UID: \"d726aceb-cffe-4667-9136-795a1442e125\") " pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.315627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxn9\" (UniqueName: \"kubernetes.io/projected/d726aceb-cffe-4667-9136-795a1442e125-kube-api-access-rtxn9\") pod \"auto-csr-approver-29561834-t7858\" (UID: \"d726aceb-cffe-4667-9136-795a1442e125\") " 
pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.330428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxn9\" (UniqueName: \"kubernetes.io/projected/d726aceb-cffe-4667-9136-795a1442e125-kube-api-access-rtxn9\") pod \"auto-csr-approver-29561834-t7858\" (UID: \"d726aceb-cffe-4667-9136-795a1442e125\") " pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:00 crc kubenswrapper[4735]: I0317 01:14:00.450130 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:01 crc kubenswrapper[4735]: E0317 01:14:01.238354 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mxnnp" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" Mar 17 01:14:01 crc kubenswrapper[4735]: E0317 01:14:01.310018 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 17 01:14:01 crc kubenswrapper[4735]: E0317 01:14:01.310171 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7d8fr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d4h2p_openshift-marketplace(277daa3a-bd0c-46b3-915f-1050fbfa37ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:14:01 crc kubenswrapper[4735]: E0317 01:14:01.311445 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d4h2p" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" Mar 17 01:14:01 crc 
kubenswrapper[4735]: E0317 01:14:01.333209 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 17 01:14:01 crc kubenswrapper[4735]: E0317 01:14:01.333367 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h67q9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-bnkg8_openshift-marketplace(0d63aa4e-027e-4486-8b30-b7b583c57b3a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:14:01 crc kubenswrapper[4735]: E0317 01:14:01.334569 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bnkg8" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.627251 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.628648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.636994 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.783276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.783341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-var-lock\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.783364 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c97756b-7a02-446c-b81c-efee6a00f860-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.884652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-var-lock\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.884713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c97756b-7a02-446c-b81c-efee6a00f860-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.884744 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-var-lock\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.884785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.884905 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.913709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c97756b-7a02-446c-b81c-efee6a00f860-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:04 crc kubenswrapper[4735]: I0317 01:14:04.955003 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.159305 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bnkg8" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.159546 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d4h2p" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" Mar 17 01:14:06 crc kubenswrapper[4735]: I0317 01:14:06.170579 4735 scope.go:117] "RemoveContainer" containerID="e1e5c5278b3bd821525ee8ef2659a6e7b5fa5f02eed7b5e0abe452f67902cc80" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.313736 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.313899 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mm4vw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s9pzj_openshift-marketplace(9766c85a-73b3-42ba-90c1-c4c93493d138): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 
01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.320912 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s9pzj" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.320946 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.321055 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwdhh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-46vft_openshift-marketplace(54e396e2-3911-4d16-9ff4-588b49a8a77c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.327545 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-46vft" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" Mar 17 01:14:06 crc 
kubenswrapper[4735]: E0317 01:14:06.338629 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.338754 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhk75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-5tdwt_openshift-marketplace(9fb14281-df17-4753-ba53-e292dcb071fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.340502 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5tdwt" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.513607 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.513997 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xfcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wd6n6_openshift-marketplace(00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:14:06 crc kubenswrapper[4735]: E0317 01:14:06.515642 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wd6n6" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" Mar 17 01:14:06 crc 
kubenswrapper[4735]: I0317 01:14:06.706513 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z"] Mar 17 01:14:06 crc kubenswrapper[4735]: W0317 01:14:06.708994 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17a5c137_93aa_4aa0_9801_b235f221c58e.slice/crio-6d22fff0528f5a7277a7ab3b0668dd396075dcea9fce6aee4b4e2b1b40784f41 WatchSource:0}: Error finding container 6d22fff0528f5a7277a7ab3b0668dd396075dcea9fce6aee4b4e2b1b40784f41: Status 404 returned error can't find the container with id 6d22fff0528f5a7277a7ab3b0668dd396075dcea9fce6aee4b4e2b1b40784f41 Mar 17 01:14:06 crc kubenswrapper[4735]: I0317 01:14:06.759589 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 17 01:14:06 crc kubenswrapper[4735]: I0317 01:14:06.786597 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69d6db4b49-wmfzl"] Mar 17 01:14:06 crc kubenswrapper[4735]: I0317 01:14:06.878116 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-t7858"] Mar 17 01:14:06 crc kubenswrapper[4735]: I0317 01:14:06.893911 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 01:14:06 crc kubenswrapper[4735]: W0317 01:14:06.899061 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c97756b_7a02_446c_b81c_efee6a00f860.slice/crio-4bbe49d54ae5524fd3ec418744f1711050016eef99f87f8423f958e6584eb9f4 WatchSource:0}: Error finding container 4bbe49d54ae5524fd3ec418744f1711050016eef99f87f8423f958e6584eb9f4: Status 404 returned error can't find the container with id 4bbe49d54ae5524fd3ec418744f1711050016eef99f87f8423f958e6584eb9f4 Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.279191 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" event={"ID":"7ff174d1-76dd-4061-a15b-5e50db28a0be","Type":"ContainerStarted","Data":"f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.279545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.279557 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" event={"ID":"7ff174d1-76dd-4061-a15b-5e50db28a0be","Type":"ContainerStarted","Data":"d2f98c20199373fedd853f0065b651f68dfea28d09966d5d85fa13b06d3b894e"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.279269 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" podUID="7ff174d1-76dd-4061-a15b-5e50db28a0be" containerName="controller-manager" containerID="cri-o://f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9" gracePeriod=30 Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.282740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c97756b-7a02-446c-b81c-efee6a00f860","Type":"ContainerStarted","Data":"14ef6ad67bcc23c39baeb4c07257091f62b41367975b819ec68b83ee69afe061"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.282774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c97756b-7a02-446c-b81c-efee6a00f860","Type":"ContainerStarted","Data":"4bbe49d54ae5524fd3ec418744f1711050016eef99f87f8423f958e6584eb9f4"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.284652 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561834-t7858" event={"ID":"d726aceb-cffe-4667-9136-795a1442e125","Type":"ContainerStarted","Data":"aca66820469ab43d8c0122bc21f8ca7546d1b9850b16250fc49d5b981698bc56"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.288364 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dkwf5" event={"ID":"3a72fe2c-32fb-4360-882b-44debb825c9e","Type":"ContainerStarted","Data":"8d32d1dfcbd7cf05c799e85ce81070de1629e4b04ed59ed33ca9012f3825de0a"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.289829 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" event={"ID":"17a5c137-93aa-4aa0-9801-b235f221c58e","Type":"ContainerStarted","Data":"f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.289976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" event={"ID":"17a5c137-93aa-4aa0-9801-b235f221c58e","Type":"ContainerStarted","Data":"6d22fff0528f5a7277a7ab3b0668dd396075dcea9fce6aee4b4e2b1b40784f41"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.290041 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.289908 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" podUID="17a5c137-93aa-4aa0-9801-b235f221c58e" containerName="route-controller-manager" containerID="cri-o://f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96" gracePeriod=30 Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.292457 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerID="1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23" exitCode=0 Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.292504 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-db76h" event={"ID":"1a4643c9-428f-46de-b795-44c73e85d7f9","Type":"ContainerDied","Data":"1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.296674 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"59343d58-3315-46bf-becf-2f99a19e0116","Type":"ContainerStarted","Data":"ff1a9ba0acd174c82ae11487767ce1f9276f7df524c46cdc2f466e41761d207d"} Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.296696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"59343d58-3315-46bf-becf-2f99a19e0116","Type":"ContainerStarted","Data":"84d77dd98fc6fe635f257351fe36143ad490b4370261e8f808aa7ad744e2a26e"} Mar 17 01:14:07 crc kubenswrapper[4735]: E0317 01:14:07.298773 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5tdwt" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" Mar 17 01:14:07 crc kubenswrapper[4735]: E0317 01:14:07.298823 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-46vft" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" Mar 17 01:14:07 crc kubenswrapper[4735]: E0317 01:14:07.298988 4735 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wd6n6" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" Mar 17 01:14:07 crc kubenswrapper[4735]: E0317 01:14:07.299138 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s9pzj" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.300149 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.314972 4735 patch_prober.go:28] interesting pod/controller-manager-69d6db4b49-wmfzl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": EOF" start-of-body= Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.315029 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" podUID="7ff174d1-76dd-4061-a15b-5e50db28a0be" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": EOF" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.334014 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" podStartSLOduration=28.333996564 podStartE2EDuration="28.333996564s" podCreationTimestamp="2026-03-17 01:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-17 01:14:07.311254676 +0000 UTC m=+272.943487654" watchObservedRunningTime="2026-03-17 01:14:07.333996564 +0000 UTC m=+272.966229542" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.379691 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=9.379674575 podStartE2EDuration="9.379674575s" podCreationTimestamp="2026-03-17 01:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:07.37710508 +0000 UTC m=+273.009338058" watchObservedRunningTime="2026-03-17 01:14:07.379674575 +0000 UTC m=+273.011907553" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.426989 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.426969997 podStartE2EDuration="3.426969997s" podCreationTimestamp="2026-03-17 01:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:07.423608082 +0000 UTC m=+273.055841060" watchObservedRunningTime="2026-03-17 01:14:07.426969997 +0000 UTC m=+273.059202975" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.455718 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" podStartSLOduration=28.455702838 podStartE2EDuration="28.455702838s" podCreationTimestamp="2026-03-17 01:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:07.445033207 +0000 UTC m=+273.077266205" watchObservedRunningTime="2026-03-17 01:14:07.455702838 +0000 UTC m=+273.087935816" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.727394 4735 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.751808 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dkwf5" podStartSLOduration=207.751792566 podStartE2EDuration="3m27.751792566s" podCreationTimestamp="2026-03-17 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:07.487487016 +0000 UTC m=+273.119720014" watchObservedRunningTime="2026-03-17 01:14:07.751792566 +0000 UTC m=+273.384025544" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.755997 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6"] Mar 17 01:14:07 crc kubenswrapper[4735]: E0317 01:14:07.756248 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a5c137-93aa-4aa0-9801-b235f221c58e" containerName="route-controller-manager" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.756259 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a5c137-93aa-4aa0-9801-b235f221c58e" containerName="route-controller-manager" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.756390 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a5c137-93aa-4aa0-9801-b235f221c58e" containerName="route-controller-manager" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.756880 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.775995 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6"] Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.797209 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.835101 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a5c137-93aa-4aa0-9801-b235f221c58e-serving-cert\") pod \"17a5c137-93aa-4aa0-9801-b235f221c58e\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.835155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-client-ca\") pod \"17a5c137-93aa-4aa0-9801-b235f221c58e\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.835186 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-config\") pod \"17a5c137-93aa-4aa0-9801-b235f221c58e\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.835259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkx46\" (UniqueName: \"kubernetes.io/projected/17a5c137-93aa-4aa0-9801-b235f221c58e-kube-api-access-mkx46\") pod \"17a5c137-93aa-4aa0-9801-b235f221c58e\" (UID: \"17a5c137-93aa-4aa0-9801-b235f221c58e\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.836028 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-config" (OuterVolumeSpecName: "config") pod "17a5c137-93aa-4aa0-9801-b235f221c58e" (UID: "17a5c137-93aa-4aa0-9801-b235f221c58e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.836546 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-client-ca" (OuterVolumeSpecName: "client-ca") pod "17a5c137-93aa-4aa0-9801-b235f221c58e" (UID: "17a5c137-93aa-4aa0-9801-b235f221c58e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.840223 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a5c137-93aa-4aa0-9801-b235f221c58e-kube-api-access-mkx46" (OuterVolumeSpecName: "kube-api-access-mkx46") pod "17a5c137-93aa-4aa0-9801-b235f221c58e" (UID: "17a5c137-93aa-4aa0-9801-b235f221c58e"). InnerVolumeSpecName "kube-api-access-mkx46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.840715 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a5c137-93aa-4aa0-9801-b235f221c58e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17a5c137-93aa-4aa0-9801-b235f221c58e" (UID: "17a5c137-93aa-4aa0-9801-b235f221c58e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.936448 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-config\") pod \"7ff174d1-76dd-4061-a15b-5e50db28a0be\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.936548 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff174d1-76dd-4061-a15b-5e50db28a0be-serving-cert\") pod \"7ff174d1-76dd-4061-a15b-5e50db28a0be\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.936731 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-proxy-ca-bundles\") pod \"7ff174d1-76dd-4061-a15b-5e50db28a0be\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.936769 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-client-ca\") pod \"7ff174d1-76dd-4061-a15b-5e50db28a0be\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.936822 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6vml\" (UniqueName: \"kubernetes.io/projected/7ff174d1-76dd-4061-a15b-5e50db28a0be-kube-api-access-r6vml\") pod \"7ff174d1-76dd-4061-a15b-5e50db28a0be\" (UID: \"7ff174d1-76dd-4061-a15b-5e50db28a0be\") " Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9973b688-f657-4c1d-9d92-d8e743a8f030-serving-cert\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-client-ca\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7ff174d1-76dd-4061-a15b-5e50db28a0be" (UID: "7ff174d1-76dd-4061-a15b-5e50db28a0be"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ff174d1-76dd-4061-a15b-5e50db28a0be" (UID: "7ff174d1-76dd-4061-a15b-5e50db28a0be"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937349 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-config" (OuterVolumeSpecName: "config") pod "7ff174d1-76dd-4061-a15b-5e50db28a0be" (UID: "7ff174d1-76dd-4061-a15b-5e50db28a0be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937491 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-config\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937744 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5bgb\" (UniqueName: \"kubernetes.io/projected/9973b688-f657-4c1d-9d92-d8e743a8f030-kube-api-access-d5bgb\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937872 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a5c137-93aa-4aa0-9801-b235f221c58e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937892 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937902 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937911 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-client-ca\") on node 
\"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937920 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a5c137-93aa-4aa0-9801-b235f221c58e-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937929 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff174d1-76dd-4061-a15b-5e50db28a0be-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.937937 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkx46\" (UniqueName: \"kubernetes.io/projected/17a5c137-93aa-4aa0-9801-b235f221c58e-kube-api-access-mkx46\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.941124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff174d1-76dd-4061-a15b-5e50db28a0be-kube-api-access-r6vml" (OuterVolumeSpecName: "kube-api-access-r6vml") pod "7ff174d1-76dd-4061-a15b-5e50db28a0be" (UID: "7ff174d1-76dd-4061-a15b-5e50db28a0be"). InnerVolumeSpecName "kube-api-access-r6vml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:07 crc kubenswrapper[4735]: I0317 01:14:07.943034 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff174d1-76dd-4061-a15b-5e50db28a0be-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ff174d1-76dd-4061-a15b-5e50db28a0be" (UID: "7ff174d1-76dd-4061-a15b-5e50db28a0be"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.038454 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5bgb\" (UniqueName: \"kubernetes.io/projected/9973b688-f657-4c1d-9d92-d8e743a8f030-kube-api-access-d5bgb\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.038515 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9973b688-f657-4c1d-9d92-d8e743a8f030-serving-cert\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.038537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-client-ca\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.038558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-config\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.038590 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6vml\" (UniqueName: 
\"kubernetes.io/projected/7ff174d1-76dd-4061-a15b-5e50db28a0be-kube-api-access-r6vml\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.038601 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff174d1-76dd-4061-a15b-5e50db28a0be-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.039793 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-config\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.040828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-client-ca\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.042396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9973b688-f657-4c1d-9d92-d8e743a8f030-serving-cert\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.052992 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5bgb\" (UniqueName: \"kubernetes.io/projected/9973b688-f657-4c1d-9d92-d8e743a8f030-kube-api-access-d5bgb\") pod \"route-controller-manager-5cbdbc955d-qlmw6\" (UID: 
\"9973b688-f657-4c1d-9d92-d8e743a8f030\") " pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.104009 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.274681 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6"] Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.304062 4735 generic.go:334] "Generic (PLEG): container finished" podID="59343d58-3315-46bf-becf-2f99a19e0116" containerID="ff1a9ba0acd174c82ae11487767ce1f9276f7df524c46cdc2f466e41761d207d" exitCode=0 Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.304147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"59343d58-3315-46bf-becf-2f99a19e0116","Type":"ContainerDied","Data":"ff1a9ba0acd174c82ae11487767ce1f9276f7df524c46cdc2f466e41761d207d"} Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.307487 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ff174d1-76dd-4061-a15b-5e50db28a0be" containerID="f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9" exitCode=0 Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.307528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" event={"ID":"7ff174d1-76dd-4061-a15b-5e50db28a0be","Type":"ContainerDied","Data":"f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9"} Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.307550 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.307566 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d6db4b49-wmfzl" event={"ID":"7ff174d1-76dd-4061-a15b-5e50db28a0be","Type":"ContainerDied","Data":"d2f98c20199373fedd853f0065b651f68dfea28d09966d5d85fa13b06d3b894e"} Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.307581 4735 scope.go:117] "RemoveContainer" containerID="f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.309276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" event={"ID":"9973b688-f657-4c1d-9d92-d8e743a8f030","Type":"ContainerStarted","Data":"078d172be12ac4b6fd256bb03bf5d5969f64cf64be1934f78f633a1041018242"} Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.310121 4735 generic.go:334] "Generic (PLEG): container finished" podID="17a5c137-93aa-4aa0-9801-b235f221c58e" containerID="f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96" exitCode=0 Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.310639 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.312734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" event={"ID":"17a5c137-93aa-4aa0-9801-b235f221c58e","Type":"ContainerDied","Data":"f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96"} Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.312870 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z" event={"ID":"17a5c137-93aa-4aa0-9801-b235f221c58e","Type":"ContainerDied","Data":"6d22fff0528f5a7277a7ab3b0668dd396075dcea9fce6aee4b4e2b1b40784f41"} Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.333993 4735 scope.go:117] "RemoveContainer" containerID="f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9" Mar 17 01:14:08 crc kubenswrapper[4735]: E0317 01:14:08.334761 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9\": container with ID starting with f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9 not found: ID does not exist" containerID="f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.334805 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9"} err="failed to get container status \"f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9\": rpc error: code = NotFound desc = could not find container \"f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9\": container with ID starting with 
f9af6f78b7a6573f2de50124a03c777030091705fca918c0e33ff5f198a88fe9 not found: ID does not exist" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.334834 4735 scope.go:117] "RemoveContainer" containerID="f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.350050 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69d6db4b49-wmfzl"] Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.360422 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69d6db4b49-wmfzl"] Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.363293 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z"] Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.366164 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ff97597-wh55z"] Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.369206 4735 scope.go:117] "RemoveContainer" containerID="f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96" Mar 17 01:14:08 crc kubenswrapper[4735]: E0317 01:14:08.370667 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96\": container with ID starting with f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96 not found: ID does not exist" containerID="f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96" Mar 17 01:14:08 crc kubenswrapper[4735]: I0317 01:14:08.370700 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96"} err="failed to get container status 
\"f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96\": rpc error: code = NotFound desc = could not find container \"f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96\": container with ID starting with f4a1e6a8e08020ecbca2297190df6240705d60d35cdeae23dd0416bfe9030b96 not found: ID does not exist" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.082110 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a5c137-93aa-4aa0-9801-b235f221c58e" path="/var/lib/kubelet/pods/17a5c137-93aa-4aa0-9801-b235f221c58e/volumes" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.083600 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff174d1-76dd-4061-a15b-5e50db28a0be" path="/var/lib/kubelet/pods/7ff174d1-76dd-4061-a15b-5e50db28a0be/volumes" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.320117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" event={"ID":"9973b688-f657-4c1d-9d92-d8e743a8f030","Type":"ContainerStarted","Data":"5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba"} Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.321074 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.327804 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.341805 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" podStartSLOduration=10.34178462 podStartE2EDuration="10.34178462s" podCreationTimestamp="2026-03-17 01:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:09.334188136 +0000 UTC m=+274.966421114" watchObservedRunningTime="2026-03-17 01:14:09.34178462 +0000 UTC m=+274.974017598" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.588194 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.671351 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59343d58-3315-46bf-becf-2f99a19e0116-kube-api-access\") pod \"59343d58-3315-46bf-becf-2f99a19e0116\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.671389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59343d58-3315-46bf-becf-2f99a19e0116-kubelet-dir\") pod \"59343d58-3315-46bf-becf-2f99a19e0116\" (UID: \"59343d58-3315-46bf-becf-2f99a19e0116\") " Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.671582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59343d58-3315-46bf-becf-2f99a19e0116-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "59343d58-3315-46bf-becf-2f99a19e0116" (UID: "59343d58-3315-46bf-becf-2f99a19e0116"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.678983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59343d58-3315-46bf-becf-2f99a19e0116-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "59343d58-3315-46bf-becf-2f99a19e0116" (UID: "59343d58-3315-46bf-becf-2f99a19e0116"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.773326 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59343d58-3315-46bf-becf-2f99a19e0116-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.773356 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59343d58-3315-46bf-becf-2f99a19e0116-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.912708 4735 csr.go:261] certificate signing request csr-fqxfg is approved, waiting to be issued Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.918509 4735 csr.go:257] certificate signing request csr-fqxfg is issued Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.953882 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-db6599c86-xn6dd"] Mar 17 01:14:09 crc kubenswrapper[4735]: E0317 01:14:09.954100 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff174d1-76dd-4061-a15b-5e50db28a0be" containerName="controller-manager" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.954111 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff174d1-76dd-4061-a15b-5e50db28a0be" containerName="controller-manager" Mar 17 01:14:09 crc kubenswrapper[4735]: E0317 01:14:09.954119 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59343d58-3315-46bf-becf-2f99a19e0116" containerName="pruner" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.954124 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="59343d58-3315-46bf-becf-2f99a19e0116" containerName="pruner" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.954215 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ff174d1-76dd-4061-a15b-5e50db28a0be" containerName="controller-manager" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.954227 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="59343d58-3315-46bf-becf-2f99a19e0116" containerName="pruner" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.954675 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.956413 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.957522 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.958374 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.958762 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.959121 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.960131 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.967054 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db6599c86-xn6dd"] Mar 17 01:14:09 crc kubenswrapper[4735]: I0317 01:14:09.967309 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 01:14:10 crc 
kubenswrapper[4735]: I0317 01:14:10.077541 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-proxy-ca-bundles\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.077590 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78nd\" (UniqueName: \"kubernetes.io/projected/a356bcea-aa88-4348-9262-efedf2873cb1-kube-api-access-f78nd\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.077633 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-client-ca\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.077654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-config\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.077697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a356bcea-aa88-4348-9262-efedf2873cb1-serving-cert\") 
pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.178632 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a356bcea-aa88-4348-9262-efedf2873cb1-serving-cert\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.178695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-proxy-ca-bundles\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.178730 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f78nd\" (UniqueName: \"kubernetes.io/projected/a356bcea-aa88-4348-9262-efedf2873cb1-kube-api-access-f78nd\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.178767 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-client-ca\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.178788 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-config\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.179751 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-proxy-ca-bundles\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.181610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-client-ca\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.182442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a356bcea-aa88-4348-9262-efedf2873cb1-serving-cert\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.183925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-config\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.209562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-f78nd\" (UniqueName: \"kubernetes.io/projected/a356bcea-aa88-4348-9262-efedf2873cb1-kube-api-access-f78nd\") pod \"controller-manager-db6599c86-xn6dd\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.300421 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.334369 4735 generic.go:334] "Generic (PLEG): container finished" podID="d726aceb-cffe-4667-9136-795a1442e125" containerID="4c2ae0a64a50cfc874ce1cb1c4c7af1fc8c6f946744975423d57f0681d565aeb" exitCode=0 Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.334430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561834-t7858" event={"ID":"d726aceb-cffe-4667-9136-795a1442e125","Type":"ContainerDied","Data":"4c2ae0a64a50cfc874ce1cb1c4c7af1fc8c6f946744975423d57f0681d565aeb"} Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.342058 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-db76h" event={"ID":"1a4643c9-428f-46de-b795-44c73e85d7f9","Type":"ContainerStarted","Data":"fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb"} Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.344063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"59343d58-3315-46bf-becf-2f99a19e0116","Type":"ContainerDied","Data":"84d77dd98fc6fe635f257351fe36143ad490b4370261e8f808aa7ad744e2a26e"} Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.344095 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84d77dd98fc6fe635f257351fe36143ad490b4370261e8f808aa7ad744e2a26e" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 
01:14:10.344259 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.757436 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-db76h" podStartSLOduration=6.259272597 podStartE2EDuration="48.757416261s" podCreationTimestamp="2026-03-17 01:13:22 +0000 UTC" firstStartedPulling="2026-03-17 01:13:26.364902915 +0000 UTC m=+231.997135893" lastFinishedPulling="2026-03-17 01:14:08.863046589 +0000 UTC m=+274.495279557" observedRunningTime="2026-03-17 01:14:10.385153247 +0000 UTC m=+276.017386225" watchObservedRunningTime="2026-03-17 01:14:10.757416261 +0000 UTC m=+276.389649229" Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.760194 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db6599c86-xn6dd"] Mar 17 01:14:10 crc kubenswrapper[4735]: W0317 01:14:10.768141 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda356bcea_aa88_4348_9262_efedf2873cb1.slice/crio-befa14a98ed517567385b0458f031bfa4a6d3bdc6246d640fd9919e2209a63ee WatchSource:0}: Error finding container befa14a98ed517567385b0458f031bfa4a6d3bdc6246d640fd9919e2209a63ee: Status 404 returned error can't find the container with id befa14a98ed517567385b0458f031bfa4a6d3bdc6246d640fd9919e2209a63ee Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.920436 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-27 12:49:48.647298755 +0000 UTC Mar 17 01:14:10 crc kubenswrapper[4735]: I0317 01:14:10.920511 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6131h35m37.726799618s for next certificate rotation Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 
01:14:11.349433 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4bd5744-869c-4763-af43-3ffcce4d549f" containerID="cee9fac3b6f9b8dde02e0a30a2a9c05b4f0af2523f31e5881aca57a1c632018d" exitCode=0 Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.349498 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" event={"ID":"c4bd5744-869c-4763-af43-3ffcce4d549f","Type":"ContainerDied","Data":"cee9fac3b6f9b8dde02e0a30a2a9c05b4f0af2523f31e5881aca57a1c632018d"} Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.351141 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" event={"ID":"a356bcea-aa88-4348-9262-efedf2873cb1","Type":"ContainerStarted","Data":"e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2"} Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.351184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" event={"ID":"a356bcea-aa88-4348-9262-efedf2873cb1","Type":"ContainerStarted","Data":"befa14a98ed517567385b0458f031bfa4a6d3bdc6246d640fd9919e2209a63ee"} Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.380801 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" podStartSLOduration=12.38078201 podStartE2EDuration="12.38078201s" podCreationTimestamp="2026-03-17 01:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:11.379745773 +0000 UTC m=+277.011978751" watchObservedRunningTime="2026-03-17 01:14:11.38078201 +0000 UTC m=+277.013014988" Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.735135 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.864764 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtxn9\" (UniqueName: \"kubernetes.io/projected/d726aceb-cffe-4667-9136-795a1442e125-kube-api-access-rtxn9\") pod \"d726aceb-cffe-4667-9136-795a1442e125\" (UID: \"d726aceb-cffe-4667-9136-795a1442e125\") " Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.880498 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d726aceb-cffe-4667-9136-795a1442e125-kube-api-access-rtxn9" (OuterVolumeSpecName: "kube-api-access-rtxn9") pod "d726aceb-cffe-4667-9136-795a1442e125" (UID: "d726aceb-cffe-4667-9136-795a1442e125"). InnerVolumeSpecName "kube-api-access-rtxn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.920777 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 01:12:50.532166273 +0000 UTC Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.920827 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6383h58m38.611342883s for next certificate rotation Mar 17 01:14:11 crc kubenswrapper[4735]: I0317 01:14:11.966371 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtxn9\" (UniqueName: \"kubernetes.io/projected/d726aceb-cffe-4667-9136-795a1442e125-kube-api-access-rtxn9\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.359095 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561834-t7858" event={"ID":"d726aceb-cffe-4667-9136-795a1442e125","Type":"ContainerDied","Data":"aca66820469ab43d8c0122bc21f8ca7546d1b9850b16250fc49d5b981698bc56"} Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 
01:14:12.359152 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aca66820469ab43d8c0122bc21f8ca7546d1b9850b16250fc49d5b981698bc56" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.359155 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-t7858" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.359504 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.366989 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.606100 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.606146 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.606200 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.610355 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.610440 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b" gracePeriod=600 Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.635185 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.785715 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hprwb\" (UniqueName: \"kubernetes.io/projected/c4bd5744-869c-4763-af43-3ffcce4d549f-kube-api-access-hprwb\") pod \"c4bd5744-869c-4763-af43-3ffcce4d549f\" (UID: \"c4bd5744-869c-4763-af43-3ffcce4d549f\") " Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.790755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4bd5744-869c-4763-af43-3ffcce4d549f-kube-api-access-hprwb" (OuterVolumeSpecName: "kube-api-access-hprwb") pod "c4bd5744-869c-4763-af43-3ffcce4d549f" (UID: "c4bd5744-869c-4763-af43-3ffcce4d549f"). InnerVolumeSpecName "kube-api-access-hprwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:12 crc kubenswrapper[4735]: I0317 01:14:12.887657 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hprwb\" (UniqueName: \"kubernetes.io/projected/c4bd5744-869c-4763-af43-3ffcce4d549f-kube-api-access-hprwb\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.364710 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.365383 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.371966 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b" exitCode=0 Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.372076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b"} Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.372108 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"a5f22d21e238a9e73749f0d0aab48f2d631b711d6cf4f090c4834f75705121ca"} Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.378013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" event={"ID":"c4bd5744-869c-4763-af43-3ffcce4d549f","Type":"ContainerDied","Data":"789fe4a4cfb775bba4a4a34b693b5a8482014a8eb04f37e3039427226b6ca56e"} Mar 17 01:14:13 crc 
kubenswrapper[4735]: I0317 01:14:13.378089 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789fe4a4cfb775bba4a4a34b693b5a8482014a8eb04f37e3039427226b6ca56e" Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.378214 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-p6vx6" Mar 17 01:14:13 crc kubenswrapper[4735]: I0317 01:14:13.726490 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:14:14 crc kubenswrapper[4735]: I0317 01:14:14.448709 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:14:17 crc kubenswrapper[4735]: I0317 01:14:17.407187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerStarted","Data":"687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce"} Mar 17 01:14:18 crc kubenswrapper[4735]: I0317 01:14:18.421252 4735 generic.go:334] "Generic (PLEG): container finished" podID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerID="687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce" exitCode=0 Mar 17 01:14:18 crc kubenswrapper[4735]: I0317 01:14:18.421326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerDied","Data":"687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce"} Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.432091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" 
event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerStarted","Data":"ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49"} Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.435061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerStarted","Data":"dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5"} Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.486981 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mxnnp" podStartSLOduration=3.4592626060000002 podStartE2EDuration="57.486959542s" podCreationTimestamp="2026-03-17 01:13:22 +0000 UTC" firstStartedPulling="2026-03-17 01:13:25.075845388 +0000 UTC m=+230.708078366" lastFinishedPulling="2026-03-17 01:14:19.103542284 +0000 UTC m=+284.735775302" observedRunningTime="2026-03-17 01:14:19.481624656 +0000 UTC m=+285.113857674" watchObservedRunningTime="2026-03-17 01:14:19.486959542 +0000 UTC m=+285.119192530" Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.568899 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db6599c86-xn6dd"] Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.569112 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" podUID="a356bcea-aa88-4348-9262-efedf2873cb1" containerName="controller-manager" containerID="cri-o://e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2" gracePeriod=30 Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.599518 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6"] Mar 17 01:14:19 crc kubenswrapper[4735]: I0317 01:14:19.600083 4735 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" podUID="9973b688-f657-4c1d-9d92-d8e743a8f030" containerName="route-controller-manager" containerID="cri-o://5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba" gracePeriod=30 Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.015744 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerName="oauth-openshift" containerID="cri-o://e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab" gracePeriod=15 Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.124342 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.210274 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-config\") pod \"9973b688-f657-4c1d-9d92-d8e743a8f030\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.210686 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5bgb\" (UniqueName: \"kubernetes.io/projected/9973b688-f657-4c1d-9d92-d8e743a8f030-kube-api-access-d5bgb\") pod \"9973b688-f657-4c1d-9d92-d8e743a8f030\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.210741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9973b688-f657-4c1d-9d92-d8e743a8f030-serving-cert\") pod \"9973b688-f657-4c1d-9d92-d8e743a8f030\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " Mar 17 01:14:20 crc 
kubenswrapper[4735]: I0317 01:14:20.210781 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-client-ca\") pod \"9973b688-f657-4c1d-9d92-d8e743a8f030\" (UID: \"9973b688-f657-4c1d-9d92-d8e743a8f030\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.211556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-config" (OuterVolumeSpecName: "config") pod "9973b688-f657-4c1d-9d92-d8e743a8f030" (UID: "9973b688-f657-4c1d-9d92-d8e743a8f030"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.212267 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-client-ca" (OuterVolumeSpecName: "client-ca") pod "9973b688-f657-4c1d-9d92-d8e743a8f030" (UID: "9973b688-f657-4c1d-9d92-d8e743a8f030"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.231124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9973b688-f657-4c1d-9d92-d8e743a8f030-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9973b688-f657-4c1d-9d92-d8e743a8f030" (UID: "9973b688-f657-4c1d-9d92-d8e743a8f030"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.234402 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9973b688-f657-4c1d-9d92-d8e743a8f030-kube-api-access-d5bgb" (OuterVolumeSpecName: "kube-api-access-d5bgb") pod "9973b688-f657-4c1d-9d92-d8e743a8f030" (UID: "9973b688-f657-4c1d-9d92-d8e743a8f030"). 
InnerVolumeSpecName "kube-api-access-d5bgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.260136 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-client-ca\") pod \"a356bcea-aa88-4348-9262-efedf2873cb1\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312518 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-config\") pod \"a356bcea-aa88-4348-9262-efedf2873cb1\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312562 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a356bcea-aa88-4348-9262-efedf2873cb1-serving-cert\") pod \"a356bcea-aa88-4348-9262-efedf2873cb1\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312609 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f78nd\" (UniqueName: \"kubernetes.io/projected/a356bcea-aa88-4348-9262-efedf2873cb1-kube-api-access-f78nd\") pod \"a356bcea-aa88-4348-9262-efedf2873cb1\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312648 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-proxy-ca-bundles\") pod 
\"a356bcea-aa88-4348-9262-efedf2873cb1\" (UID: \"a356bcea-aa88-4348-9262-efedf2873cb1\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312887 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5bgb\" (UniqueName: \"kubernetes.io/projected/9973b688-f657-4c1d-9d92-d8e743a8f030-kube-api-access-d5bgb\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312903 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9973b688-f657-4c1d-9d92-d8e743a8f030-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312919 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.312928 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9973b688-f657-4c1d-9d92-d8e743a8f030-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.313610 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a356bcea-aa88-4348-9262-efedf2873cb1" (UID: "a356bcea-aa88-4348-9262-efedf2873cb1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.314054 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-config" (OuterVolumeSpecName: "config") pod "a356bcea-aa88-4348-9262-efedf2873cb1" (UID: "a356bcea-aa88-4348-9262-efedf2873cb1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.314308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-client-ca" (OuterVolumeSpecName: "client-ca") pod "a356bcea-aa88-4348-9262-efedf2873cb1" (UID: "a356bcea-aa88-4348-9262-efedf2873cb1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.320324 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a356bcea-aa88-4348-9262-efedf2873cb1-kube-api-access-f78nd" (OuterVolumeSpecName: "kube-api-access-f78nd") pod "a356bcea-aa88-4348-9262-efedf2873cb1" (UID: "a356bcea-aa88-4348-9262-efedf2873cb1"). InnerVolumeSpecName "kube-api-access-f78nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.337690 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a356bcea-aa88-4348-9262-efedf2873cb1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a356bcea-aa88-4348-9262-efedf2873cb1" (UID: "a356bcea-aa88-4348-9262-efedf2873cb1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.413280 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f78nd\" (UniqueName: \"kubernetes.io/projected/a356bcea-aa88-4348-9262-efedf2873cb1-kube-api-access-f78nd\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.413307 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.413317 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.413328 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a356bcea-aa88-4348-9262-efedf2873cb1-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.413336 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a356bcea-aa88-4348-9262-efedf2873cb1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.436266 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.444377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerStarted","Data":"6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.445658 4735 generic.go:334] "Generic (PLEG): container finished" podID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerID="e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab" exitCode=0 Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.445731 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" event={"ID":"6bdc2a09-1108-46eb-bfad-41ddee55a93c","Type":"ContainerDied","Data":"e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.445760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" event={"ID":"6bdc2a09-1108-46eb-bfad-41ddee55a93c","Type":"ContainerDied","Data":"f3b7ea804ccf5c92e1e809a02e83f8ee868ae9ef311fafcae276fd969773cd00"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.445776 4735 scope.go:117] "RemoveContainer" containerID="e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.445904 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zgxs6" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.450615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerStarted","Data":"dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.452590 4735 generic.go:334] "Generic (PLEG): container finished" podID="9973b688-f657-4c1d-9d92-d8e743a8f030" containerID="5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba" exitCode=0 Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.452647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" event={"ID":"9973b688-f657-4c1d-9d92-d8e743a8f030","Type":"ContainerDied","Data":"5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.452664 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" event={"ID":"9973b688-f657-4c1d-9d92-d8e743a8f030","Type":"ContainerDied","Data":"078d172be12ac4b6fd256bb03bf5d5969f64cf64be1934f78f633a1041018242"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.452707 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.461136 4735 scope.go:117] "RemoveContainer" containerID="e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.461177 4735 generic.go:334] "Generic (PLEG): container finished" podID="a356bcea-aa88-4348-9262-efedf2873cb1" containerID="e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2" exitCode=0 Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.461238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" event={"ID":"a356bcea-aa88-4348-9262-efedf2873cb1","Type":"ContainerDied","Data":"e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.461266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" event={"ID":"a356bcea-aa88-4348-9262-efedf2873cb1","Type":"ContainerDied","Data":"befa14a98ed517567385b0458f031bfa4a6d3bdc6246d640fd9919e2209a63ee"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.461292 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-db6599c86-xn6dd" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.467744 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab\": container with ID starting with e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab not found: ID does not exist" containerID="e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.467915 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab"} err="failed to get container status \"e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab\": rpc error: code = NotFound desc = could not find container \"e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab\": container with ID starting with e3c7121ba99a06b6d323269ad776ffebe1f64570ae9d0326c5a9cc0e37971cab not found: ID does not exist" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.468003 4735 scope.go:117] "RemoveContainer" containerID="5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.476634 4735 generic.go:334] "Generic (PLEG): container finished" podID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerID="ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49" exitCode=0 Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.476678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerDied","Data":"ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49"} Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.502434 4735 scope.go:117] 
"RemoveContainer" containerID="5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.502947 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba\": container with ID starting with 5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba not found: ID does not exist" containerID="5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.502975 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba"} err="failed to get container status \"5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba\": rpc error: code = NotFound desc = could not find container \"5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba\": container with ID starting with 5a8ae3c1b206867343fae001363d6dab8f4554cee1afa485cf7f76f97d3d1bba not found: ID does not exist" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.503054 4735 scope.go:117] "RemoveContainer" containerID="e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.518610 4735 scope.go:117] "RemoveContainer" containerID="e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.519082 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2\": container with ID starting with e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2 not found: ID does not exist" containerID="e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2" Mar 17 01:14:20 crc 
kubenswrapper[4735]: I0317 01:14:20.519111 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2"} err="failed to get container status \"e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2\": rpc error: code = NotFound desc = could not find container \"e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2\": container with ID starting with e9e58185ccb08353847257c4e4804851fd6176976da633218b6e9786756757a2 not found: ID does not exist" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.547103 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.551380 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbdbc955d-qlmw6"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.581512 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db6599c86-xn6dd"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.585801 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-db6599c86-xn6dd"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619146 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-serving-cert\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-policies\") 
pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619253 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-dir\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619315 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z9mn\" (UniqueName: \"kubernetes.io/projected/6bdc2a09-1108-46eb-bfad-41ddee55a93c-kube-api-access-2z9mn\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-session\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-trusted-ca-bundle\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619438 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-service-ca\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc 
kubenswrapper[4735]: I0317 01:14:20.619469 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-login\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619498 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-provider-selection\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619525 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-router-certs\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-idp-0-file-data\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619606 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-error\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619630 
4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-ocp-branding-template\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.619666 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-cliconfig\") pod \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\" (UID: \"6bdc2a09-1108-46eb-bfad-41ddee55a93c\") " Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.620994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.621982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.625661 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.626344 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.626382 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.626783 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdc2a09-1108-46eb-bfad-41ddee55a93c-kube-api-access-2z9mn" (OuterVolumeSpecName: "kube-api-access-2z9mn") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "kube-api-access-2z9mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.626810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.626841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.627582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.628042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.635192 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.636687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.637304 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.644064 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6bdc2a09-1108-46eb-bfad-41ddee55a93c" (UID: "6bdc2a09-1108-46eb-bfad-41ddee55a93c"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721241 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721280 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721290 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bdc2a09-1108-46eb-bfad-41ddee55a93c-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721299 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z9mn\" (UniqueName: \"kubernetes.io/projected/6bdc2a09-1108-46eb-bfad-41ddee55a93c-kube-api-access-2z9mn\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721309 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721320 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721328 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721337 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721347 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721357 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721366 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721375 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.721384 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc 
kubenswrapper[4735]: I0317 01:14:20.721392 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bdc2a09-1108-46eb-bfad-41ddee55a93c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.777331 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zgxs6"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.783047 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zgxs6"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.962679 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"] Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.963148 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4bd5744-869c-4763-af43-3ffcce4d549f" containerName="oc" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.963287 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4bd5744-869c-4763-af43-3ffcce4d549f" containerName="oc" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.963375 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9973b688-f657-4c1d-9d92-d8e743a8f030" containerName="route-controller-manager" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.967687 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9973b688-f657-4c1d-9d92-d8e743a8f030" containerName="route-controller-manager" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.967901 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a356bcea-aa88-4348-9262-efedf2873cb1" containerName="controller-manager" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968036 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a356bcea-aa88-4348-9262-efedf2873cb1" 
containerName="controller-manager" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.968116 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerName="oauth-openshift" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968186 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerName="oauth-openshift" Mar 17 01:14:20 crc kubenswrapper[4735]: E0317 01:14:20.968270 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d726aceb-cffe-4667-9136-795a1442e125" containerName="oc" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968341 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d726aceb-cffe-4667-9136-795a1442e125" containerName="oc" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968613 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a356bcea-aa88-4348-9262-efedf2873cb1" containerName="controller-manager" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968713 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4bd5744-869c-4763-af43-3ffcce4d549f" containerName="oc" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968801 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" containerName="oauth-openshift" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.968908 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9973b688-f657-4c1d-9d92-d8e743a8f030" containerName="route-controller-manager" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.969001 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d726aceb-cffe-4667-9136-795a1442e125" containerName="oc" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.969498 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.969787 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.979339 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.980512 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"] Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.981767 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.982092 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.982539 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.982799 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.983012 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.983459 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.983492 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.983794 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.984042 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.984145 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.984455 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.984549 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.993146 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 01:14:20 crc kubenswrapper[4735]: I0317 01:14:20.997695 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"] Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.026827 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xmg\" (UniqueName: \"kubernetes.io/projected/ad2f567f-6b35-4399-9f4f-63ab9640173c-kube-api-access-k7xmg\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027347 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-proxy-ca-bundles\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027441 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-config\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027523 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-client-ca\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efb4d4b3-3446-47ec-b3e7-2ad34071d332-serving-cert\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027736 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-config\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: 
\"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027830 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-client-ca\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.027951 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2f567f-6b35-4399-9f4f-63ab9640173c-serving-cert\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.028053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqn7c\" (UniqueName: \"kubernetes.io/projected/efb4d4b3-3446-47ec-b3e7-2ad34071d332-kube-api-access-cqn7c\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.083970 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdc2a09-1108-46eb-bfad-41ddee55a93c" path="/var/lib/kubelet/pods/6bdc2a09-1108-46eb-bfad-41ddee55a93c/volumes" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.084454 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9973b688-f657-4c1d-9d92-d8e743a8f030" path="/var/lib/kubelet/pods/9973b688-f657-4c1d-9d92-d8e743a8f030/volumes" Mar 17 01:14:21 crc 
kubenswrapper[4735]: I0317 01:14:21.084928 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a356bcea-aa88-4348-9262-efedf2873cb1" path="/var/lib/kubelet/pods/a356bcea-aa88-4348-9262-efedf2873cb1/volumes" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2f567f-6b35-4399-9f4f-63ab9640173c-serving-cert\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129464 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqn7c\" (UniqueName: \"kubernetes.io/projected/efb4d4b3-3446-47ec-b3e7-2ad34071d332-kube-api-access-cqn7c\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xmg\" (UniqueName: \"kubernetes.io/projected/ad2f567f-6b35-4399-9f4f-63ab9640173c-kube-api-access-k7xmg\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129538 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-proxy-ca-bundles\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc 
kubenswrapper[4735]: I0317 01:14:21.129558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-config\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-client-ca\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129591 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efb4d4b3-3446-47ec-b3e7-2ad34071d332-serving-cert\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129617 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-config\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.129633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-client-ca\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " 
pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.131023 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-client-ca\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.131293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-config\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.132524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-config\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.132648 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-client-ca\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.132679 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-proxy-ca-bundles\") pod 
\"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.135996 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efb4d4b3-3446-47ec-b3e7-2ad34071d332-serving-cert\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.136732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2f567f-6b35-4399-9f4f-63ab9640173c-serving-cert\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.152200 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xmg\" (UniqueName: \"kubernetes.io/projected/ad2f567f-6b35-4399-9f4f-63ab9640173c-kube-api-access-k7xmg\") pod \"route-controller-manager-54d64694d7-65tmq\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.158884 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqn7c\" (UniqueName: \"kubernetes.io/projected/efb4d4b3-3446-47ec-b3e7-2ad34071d332-kube-api-access-cqn7c\") pod \"controller-manager-64dcdbcd69-87x7s\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.311012 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.318327 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.493380 4735 generic.go:334] "Generic (PLEG): container finished" podID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerID="dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1" exitCode=0 Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.493636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerDied","Data":"dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1"} Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.521202 4735 generic.go:334] "Generic (PLEG): container finished" podID="9fb14281-df17-4753-ba53-e292dcb071fa" containerID="6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605" exitCode=0 Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.521241 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerDied","Data":"6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605"} Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.788022 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"] Mar 17 01:14:21 crc kubenswrapper[4735]: W0317 01:14:21.859809 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb4d4b3_3446_47ec_b3e7_2ad34071d332.slice/crio-d526878a3daa4af26e7f1adcd9e270478984bc88f7ad665301813d74d7e50af2 WatchSource:0}: 
Error finding container d526878a3daa4af26e7f1adcd9e270478984bc88f7ad665301813d74d7e50af2: Status 404 returned error can't find the container with id d526878a3daa4af26e7f1adcd9e270478984bc88f7ad665301813d74d7e50af2 Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.863023 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"] Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.964371 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"] Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.965266 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970231 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970273 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-login\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: 
\"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970495 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970557 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970671 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc 
kubenswrapper[4735]: I0317 01:14:21.970752 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970808 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-audit-policies\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970941 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.970973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d519ee60-0e44-4a4f-a408-977b544cf2ef-audit-dir\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.971072 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-error\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.971126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxth5\" (UniqueName: \"kubernetes.io/projected/d519ee60-0e44-4a4f-a408-977b544cf2ef-kube-api-access-mxth5\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.971167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-session\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.974871 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.974846 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" 
Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.976345 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.976427 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.976692 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.976694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.976839 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.978186 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.983997 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.988359 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.991779 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 01:14:21 crc kubenswrapper[4735]: I0317 01:14:21.998802 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.003040 4735 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"] Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.007601 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.071999 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-login\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072068 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072119 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: 
\"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072260 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-audit-policies\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072420 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d519ee60-0e44-4a4f-a408-977b544cf2ef-audit-dir\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072500 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-error\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxth5\" (UniqueName: \"kubernetes.io/projected/d519ee60-0e44-4a4f-a408-977b544cf2ef-kube-api-access-mxth5\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072584 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-session\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " 
pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.072636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.073988 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.074106 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-audit-policies\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.074322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d519ee60-0e44-4a4f-a408-977b544cf2ef-audit-dir\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.074958 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.075575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.080668 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.080750 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-session\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.080985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-error\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.081207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.082103 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.083938 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.084011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-user-template-login\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.088461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d519ee60-0e44-4a4f-a408-977b544cf2ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.105580 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxth5\" (UniqueName: \"kubernetes.io/projected/d519ee60-0e44-4a4f-a408-977b544cf2ef-kube-api-access-mxth5\") pod \"oauth-openshift-5686c9c7dd-q7b8z\" (UID: \"d519ee60-0e44-4a4f-a408-977b544cf2ef\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.292442 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.531980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" event={"ID":"efb4d4b3-3446-47ec-b3e7-2ad34071d332","Type":"ContainerStarted","Data":"d526878a3daa4af26e7f1adcd9e270478984bc88f7ad665301813d74d7e50af2"}
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.537962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" event={"ID":"ad2f567f-6b35-4399-9f4f-63ab9640173c","Type":"ContainerStarted","Data":"bec8771ee7c288d2b5ed67a67489ee9c95880452f11edc158e7492648a9e0ad0"}
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.739364 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mxnnp"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.739974 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mxnnp"
Mar 17 01:14:22 crc kubenswrapper[4735]: I0317 01:14:22.815768 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mxnnp"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.113538 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"]
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.546296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerStarted","Data":"18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.549736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerStarted","Data":"19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.551898 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" event={"ID":"ad2f567f-6b35-4399-9f4f-63ab9640173c","Type":"ContainerStarted","Data":"3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.552243 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.555405 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerStarted","Data":"197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.556272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" event={"ID":"d519ee60-0e44-4a4f-a408-977b544cf2ef","Type":"ContainerStarted","Data":"e2a9f74ee05a75659c69c223e6ec39048f7618ade6f2ab937eb94a624ba8d2d2"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.558148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerStarted","Data":"cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.559517 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" event={"ID":"efb4d4b3-3446-47ec-b3e7-2ad34071d332","Type":"ContainerStarted","Data":"194a4fbb87cb033eb0f6fa81e1dd98eff04de21a0d5ab46d316ae8e5cd028aeb"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.559668 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.562383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerStarted","Data":"1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.567091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerStarted","Data":"bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55"}
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.579494 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tdwt" podStartSLOduration=2.020974107 podStartE2EDuration="57.579478401s" podCreationTimestamp="2026-03-17 01:13:26 +0000 UTC" firstStartedPulling="2026-03-17 01:13:27.472744148 +0000 UTC m=+233.104977126" lastFinishedPulling="2026-03-17 01:14:23.031248442 +0000 UTC m=+288.663481420" observedRunningTime="2026-03-17 01:14:23.576733111 +0000 UTC m=+289.208966089" watchObservedRunningTime="2026-03-17 01:14:23.579478401 +0000 UTC m=+289.211711379"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.582914 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.598634 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bnkg8" podStartSLOduration=3.092101928 podStartE2EDuration="1m0.598616377s" podCreationTimestamp="2026-03-17 01:13:23 +0000 UTC" firstStartedPulling="2026-03-17 01:13:25.125832081 +0000 UTC m=+230.758065059" lastFinishedPulling="2026-03-17 01:14:22.63234652 +0000 UTC m=+288.264579508" observedRunningTime="2026-03-17 01:14:23.596927294 +0000 UTC m=+289.229160272" watchObservedRunningTime="2026-03-17 01:14:23.598616377 +0000 UTC m=+289.230849355"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.634521 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" podStartSLOduration=4.634501219 podStartE2EDuration="4.634501219s" podCreationTimestamp="2026-03-17 01:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:23.633218127 +0000 UTC m=+289.265451105" watchObservedRunningTime="2026-03-17 01:14:23.634501219 +0000 UTC m=+289.266734197"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.654303 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" podStartSLOduration=4.654284532 podStartE2EDuration="4.654284532s" podCreationTimestamp="2026-03-17 01:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:23.654152569 +0000 UTC m=+289.286385547" watchObservedRunningTime="2026-03-17 01:14:23.654284532 +0000 UTC m=+289.286517500"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.712044 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d4h2p" podStartSLOduration=4.542337805 podStartE2EDuration="59.71202557s" podCreationTimestamp="2026-03-17 01:13:24 +0000 UTC" firstStartedPulling="2026-03-17 01:13:27.490430292 +0000 UTC m=+233.122663270" lastFinishedPulling="2026-03-17 01:14:22.660118057 +0000 UTC m=+288.292351035" observedRunningTime="2026-03-17 01:14:23.693048158 +0000 UTC m=+289.325281136" watchObservedRunningTime="2026-03-17 01:14:23.71202557 +0000 UTC m=+289.344258538"
Mar 17 01:14:23 crc kubenswrapper[4735]: I0317 01:14:23.733542 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.573802 4735 generic.go:334] "Generic (PLEG): container finished" podID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerID="bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55" exitCode=0
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.573908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerDied","Data":"bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55"}
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.575901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" event={"ID":"d519ee60-0e44-4a4f-a408-977b544cf2ef","Type":"ContainerStarted","Data":"8aa352071951638a2a7652bdbecb031d255f88431fb4f4f7d680d72619130c91"}
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.576535 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.581533 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z"
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.582017 4735 generic.go:334] "Generic (PLEG): container finished" podID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerID="1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1" exitCode=0
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.582104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerDied","Data":"1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1"}
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.585582 4735 generic.go:334] "Generic (PLEG): container finished" podID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerID="19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f" exitCode=0
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.585828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerDied","Data":"19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f"}
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.657142 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5686c9c7dd-q7b8z" podStartSLOduration=30.657125698 podStartE2EDuration="30.657125698s" podCreationTimestamp="2026-03-17 01:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:24.653762883 +0000 UTC m=+290.285995861" watchObservedRunningTime="2026-03-17 01:14:24.657125698 +0000 UTC m=+290.289358676"
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.981835 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d4h2p"
Mar 17 01:14:24 crc kubenswrapper[4735]: I0317 01:14:24.982139 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d4h2p"
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.039524 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-d4h2p" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="registry-server" probeResult="failure" output=<
Mar 17 01:14:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 01:14:26 crc kubenswrapper[4735]: >
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.434208 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tdwt"
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.434484 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tdwt"
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.598672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerStarted","Data":"5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172"}
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.600448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerStarted","Data":"d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4"}
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.602880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerStarted","Data":"9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286"}
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.640413 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-46vft" podStartSLOduration=3.343377893 podStartE2EDuration="1m1.640382721s" podCreationTimestamp="2026-03-17 01:13:25 +0000 UTC" firstStartedPulling="2026-03-17 01:13:27.477368065 +0000 UTC m=+233.109601053" lastFinishedPulling="2026-03-17 01:14:25.774372903 +0000 UTC m=+291.406605881" observedRunningTime="2026-03-17 01:14:26.618492094 +0000 UTC m=+292.250725072" watchObservedRunningTime="2026-03-17 01:14:26.640382721 +0000 UTC m=+292.272615699"
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.643082 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9pzj" podStartSLOduration=3.731179212 podStartE2EDuration="1m4.643047139s" podCreationTimestamp="2026-03-17 01:13:22 +0000 UTC" firstStartedPulling="2026-03-17 01:13:25.095738227 +0000 UTC m=+230.727971205" lastFinishedPulling="2026-03-17 01:14:26.007606164 +0000 UTC m=+291.639839132" observedRunningTime="2026-03-17 01:14:26.634466821 +0000 UTC m=+292.266699799" watchObservedRunningTime="2026-03-17 01:14:26.643047139 +0000 UTC m=+292.275280147"
Mar 17 01:14:26 crc kubenswrapper[4735]: I0317 01:14:26.663969 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wd6n6" podStartSLOduration=3.029065776 podStartE2EDuration="1m1.663946431s" podCreationTimestamp="2026-03-17 01:13:25 +0000 UTC" firstStartedPulling="2026-03-17 01:13:27.554181751 +0000 UTC m=+233.186414729" lastFinishedPulling="2026-03-17 01:14:26.189062406 +0000 UTC m=+291.821295384" observedRunningTime="2026-03-17 01:14:26.659672062 +0000 UTC m=+292.291905040" watchObservedRunningTime="2026-03-17 01:14:26.663946431 +0000 UTC m=+292.296179409"
Mar 17 01:14:27 crc kubenswrapper[4735]: I0317 01:14:27.490428 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tdwt" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="registry-server" probeResult="failure" output=<
Mar 17 01:14:27 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 01:14:27 crc kubenswrapper[4735]: >
Mar 17 01:14:32 crc kubenswrapper[4735]: I0317 01:14:32.790579 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mxnnp"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.170776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9pzj"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.170847 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9pzj"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.234549 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9pzj"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.478557 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bnkg8"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.478593 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bnkg8"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.527514 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bnkg8"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.694057 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9pzj"
Mar 17 01:14:33 crc kubenswrapper[4735]: I0317 01:14:33.698544 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bnkg8"
Mar 17 01:14:34 crc kubenswrapper[4735]: I0317 01:14:34.227591 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bnkg8"]
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.035049 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d4h2p"
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.105679 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d4h2p"
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.655208 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bnkg8" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="registry-server" containerID="cri-o://cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad" gracePeriod=2
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.680548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wd6n6"
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.680685 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wd6n6"
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.748932 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wd6n6"
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.982726 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-46vft"
Mar 17 01:14:35 crc kubenswrapper[4735]: I0317 01:14:35.983167 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-46vft"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.034495 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9pzj"]
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.034798 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9pzj" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="registry-server" containerID="cri-o://d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4" gracePeriod=2
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.043836 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-46vft"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.235526 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnkg8"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.325996 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-catalog-content\") pod \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") "
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.326066 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h67q9\" (UniqueName: \"kubernetes.io/projected/0d63aa4e-027e-4486-8b30-b7b583c57b3a-kube-api-access-h67q9\") pod \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") "
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.326120 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-utilities\") pod \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\" (UID: \"0d63aa4e-027e-4486-8b30-b7b583c57b3a\") "
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.327251 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-utilities" (OuterVolumeSpecName: "utilities") pod "0d63aa4e-027e-4486-8b30-b7b583c57b3a" (UID: "0d63aa4e-027e-4486-8b30-b7b583c57b3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.330677 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d63aa4e-027e-4486-8b30-b7b583c57b3a-kube-api-access-h67q9" (OuterVolumeSpecName: "kube-api-access-h67q9") pod "0d63aa4e-027e-4486-8b30-b7b583c57b3a" (UID: "0d63aa4e-027e-4486-8b30-b7b583c57b3a"). InnerVolumeSpecName "kube-api-access-h67q9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.374721 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d63aa4e-027e-4486-8b30-b7b583c57b3a" (UID: "0d63aa4e-027e-4486-8b30-b7b583c57b3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.427739 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.427811 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63aa4e-027e-4486-8b30-b7b583c57b3a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.427831 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h67q9\" (UniqueName: \"kubernetes.io/projected/0d63aa4e-027e-4486-8b30-b7b583c57b3a-kube-api-access-h67q9\") on node \"crc\" DevicePath \"\""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.480708 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tdwt"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.525062 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tdwt"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.568237 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9pzj"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.665989 4735 generic.go:334] "Generic (PLEG): container finished" podID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerID="d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4" exitCode=0
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.666087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerDied","Data":"d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4"}
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.666131 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9pzj" event={"ID":"9766c85a-73b3-42ba-90c1-c4c93493d138","Type":"ContainerDied","Data":"2a14b9e26e66a1a58fe3e6ebcb93c5d2c110c37e3c59ebaaaf62453f43c45caf"}
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.666168 4735 scope.go:117] "RemoveContainer" containerID="d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.666516 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9pzj"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.671430 4735 generic.go:334] "Generic (PLEG): container finished" podID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerID="cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad" exitCode=0
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.671492 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnkg8"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.671552 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerDied","Data":"cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad"}
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.671617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnkg8" event={"ID":"0d63aa4e-027e-4486-8b30-b7b583c57b3a","Type":"ContainerDied","Data":"992fcb6bdcebe3b48efe5b88cd4781a9d3ec92fbfe97fc3346b5f09aaf59f6a3"}
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.697711 4735 scope.go:117] "RemoveContainer" containerID="bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.731820 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-utilities\") pod \"9766c85a-73b3-42ba-90c1-c4c93493d138\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") "
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.732113 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4vw\" (UniqueName: \"kubernetes.io/projected/9766c85a-73b3-42ba-90c1-c4c93493d138-kube-api-access-mm4vw\") pod \"9766c85a-73b3-42ba-90c1-c4c93493d138\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") "
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.732192 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-catalog-content\") pod \"9766c85a-73b3-42ba-90c1-c4c93493d138\" (UID: \"9766c85a-73b3-42ba-90c1-c4c93493d138\") "
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.735072 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-utilities" (OuterVolumeSpecName: "utilities") pod "9766c85a-73b3-42ba-90c1-c4c93493d138" (UID: "9766c85a-73b3-42ba-90c1-c4c93493d138"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.740405 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9766c85a-73b3-42ba-90c1-c4c93493d138-kube-api-access-mm4vw" (OuterVolumeSpecName: "kube-api-access-mm4vw") pod "9766c85a-73b3-42ba-90c1-c4c93493d138" (UID: "9766c85a-73b3-42ba-90c1-c4c93493d138"). InnerVolumeSpecName "kube-api-access-mm4vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.745505 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bnkg8"]
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.745679 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wd6n6"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.756597 4735 scope.go:117] "RemoveContainer" containerID="ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.760938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-46vft"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.763009 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bnkg8"]
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.783147 4735 scope.go:117] "RemoveContainer" containerID="d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4"
Mar 17 01:14:36 crc kubenswrapper[4735]: E0317 01:14:36.783622 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4\": container with ID starting with d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4 not found: ID does not exist" containerID="d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.783665 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4"} err="failed to get container status \"d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4\": rpc error: code = NotFound desc = could not find container \"d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4\": container with ID starting with d72a34b609e158af94236ec7fc966c195817931866315530ee0e0be6aafacfe4 not found: ID does not exist"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.783695 4735 scope.go:117] "RemoveContainer" containerID="bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55"
Mar 17 01:14:36 crc kubenswrapper[4735]: E0317 01:14:36.784089 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55\": container with ID starting with bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55 not found: ID does not exist" containerID="bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.784110 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55"} err="failed to get container status \"bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55\": rpc error: code = NotFound desc = could not find container \"bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55\": container with ID starting with bf37fc654ab412bdf8aa0cd354569daadb4b07a72aca0d7f95c6306b0c35ba55 not found: ID does not exist"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.784127 4735 scope.go:117] "RemoveContainer" containerID="ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f"
Mar 17 01:14:36 crc kubenswrapper[4735]: E0317 01:14:36.784554 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f\": container with ID starting with ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f not found: ID does not exist" containerID="ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.784604 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f"} err="failed to get container status \"ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f\": rpc error: code = NotFound desc = could not find container \"ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f\": container with ID starting with ec49e52ae650c5b98f4ba58b7c557cb74ad886c207b17b358735bdbd26a3f65f not found: ID does not exist"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.784622 4735 scope.go:117] "RemoveContainer" containerID="cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.803105 4735 scope.go:117] "RemoveContainer" containerID="ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.810592 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9766c85a-73b3-42ba-90c1-c4c93493d138" (UID: "9766c85a-73b3-42ba-90c1-c4c93493d138"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.836851 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4vw\" (UniqueName: \"kubernetes.io/projected/9766c85a-73b3-42ba-90c1-c4c93493d138-kube-api-access-mm4vw\") on node \"crc\" DevicePath \"\""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.837306 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.837461 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9766c85a-73b3-42ba-90c1-c4c93493d138-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.841876 4735 scope.go:117] "RemoveContainer" containerID="539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.855130 4735 scope.go:117] "RemoveContainer" containerID="cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad"
Mar 17 01:14:36 crc kubenswrapper[4735]: E0317 01:14:36.855517 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad\": container with ID starting with cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad not found: ID does not exist" containerID="cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad"
Mar 17 01:14:36 crc kubenswrapper[4735]: I0317
01:14:36.855616 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad"} err="failed to get container status \"cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad\": rpc error: code = NotFound desc = could not find container \"cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad\": container with ID starting with cac170057d1d435902820a61f303022171225440d0fd43d4d26d6c550c003dad not found: ID does not exist" Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.855699 4735 scope.go:117] "RemoveContainer" containerID="ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49" Mar 17 01:14:36 crc kubenswrapper[4735]: E0317 01:14:36.856041 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49\": container with ID starting with ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49 not found: ID does not exist" containerID="ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49" Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.856067 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49"} err="failed to get container status \"ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49\": rpc error: code = NotFound desc = could not find container \"ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49\": container with ID starting with ed062c479c1eddd6fff6c57f080835319e4e2320767af4b42a44220da81d2a49 not found: ID does not exist" Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.856086 4735 scope.go:117] "RemoveContainer" containerID="539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9" Mar 17 01:14:36 crc 
kubenswrapper[4735]: E0317 01:14:36.856278 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9\": container with ID starting with 539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9 not found: ID does not exist" containerID="539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9" Mar 17 01:14:36 crc kubenswrapper[4735]: I0317 01:14:36.856303 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9"} err="failed to get container status \"539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9\": rpc error: code = NotFound desc = could not find container \"539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9\": container with ID starting with 539b5f0a5699154ae8ba1985b1a75c95cbb5b6a70687f564b127020f997f02e9 not found: ID does not exist" Mar 17 01:14:37 crc kubenswrapper[4735]: I0317 01:14:37.008967 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9pzj"] Mar 17 01:14:37 crc kubenswrapper[4735]: I0317 01:14:37.016022 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9pzj"] Mar 17 01:14:37 crc kubenswrapper[4735]: I0317 01:14:37.084656 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" path="/var/lib/kubelet/pods/0d63aa4e-027e-4486-8b30-b7b583c57b3a/volumes" Mar 17 01:14:37 crc kubenswrapper[4735]: I0317 01:14:37.086027 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" path="/var/lib/kubelet/pods/9766c85a-73b3-42ba-90c1-c4c93493d138/volumes" Mar 17 01:14:38 crc kubenswrapper[4735]: I0317 01:14:38.428635 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wd6n6"] Mar 17 01:14:38 crc kubenswrapper[4735]: I0317 01:14:38.698935 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wd6n6" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="registry-server" containerID="cri-o://9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286" gracePeriod=2 Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.334200 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.377203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-utilities\") pod \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.377656 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-catalog-content\") pod \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.377717 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfcd\" (UniqueName: \"kubernetes.io/projected/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-kube-api-access-2xfcd\") pod \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\" (UID: \"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6\") " Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.378194 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-utilities" (OuterVolumeSpecName: "utilities") pod "00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" (UID: 
"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.387160 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-kube-api-access-2xfcd" (OuterVolumeSpecName: "kube-api-access-2xfcd") pod "00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" (UID: "00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6"). InnerVolumeSpecName "kube-api-access-2xfcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.428496 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" (UID: "00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.479074 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfcd\" (UniqueName: \"kubernetes.io/projected/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-kube-api-access-2xfcd\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.479113 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.479126 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.533238 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"] Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.533477 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" podUID="efb4d4b3-3446-47ec-b3e7-2ad34071d332" containerName="controller-manager" containerID="cri-o://194a4fbb87cb033eb0f6fa81e1dd98eff04de21a0d5ab46d316ae8e5cd028aeb" gracePeriod=30 Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.632527 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"] Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.632752 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" podUID="ad2f567f-6b35-4399-9f4f-63ab9640173c" containerName="route-controller-manager" containerID="cri-o://3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f" gracePeriod=30 Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.714299 4735 generic.go:334] "Generic (PLEG): container finished" podID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerID="9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286" exitCode=0 Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.714366 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd6n6" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.714388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerDied","Data":"9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286"} Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.714423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd6n6" event={"ID":"00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6","Type":"ContainerDied","Data":"505f2b91f5294f96003a74321b722add82fdbc4a6368f5da9b2282f45e045e72"} Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.714444 4735 scope.go:117] "RemoveContainer" containerID="9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.726073 4735 generic.go:334] "Generic (PLEG): container finished" podID="efb4d4b3-3446-47ec-b3e7-2ad34071d332" containerID="194a4fbb87cb033eb0f6fa81e1dd98eff04de21a0d5ab46d316ae8e5cd028aeb" exitCode=0 Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.726113 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" event={"ID":"efb4d4b3-3446-47ec-b3e7-2ad34071d332","Type":"ContainerDied","Data":"194a4fbb87cb033eb0f6fa81e1dd98eff04de21a0d5ab46d316ae8e5cd028aeb"} Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.742580 4735 scope.go:117] "RemoveContainer" containerID="1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.745732 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd6n6"] Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.753271 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wd6n6"] Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.761529 4735 scope.go:117] "RemoveContainer" containerID="a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.792146 4735 scope.go:117] "RemoveContainer" containerID="9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286" Mar 17 01:14:39 crc kubenswrapper[4735]: E0317 01:14:39.795035 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286\": container with ID starting with 9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286 not found: ID does not exist" containerID="9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.795085 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286"} err="failed to get container status \"9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286\": rpc error: code = NotFound desc = could not find container \"9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286\": container with ID starting with 9076badc834ae3172ea2cc85e0c59e5e7d733563cdf02e4456d0169cd149c286 not found: ID does not exist" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.795110 4735 scope.go:117] "RemoveContainer" containerID="1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1" Mar 17 01:14:39 crc kubenswrapper[4735]: E0317 01:14:39.795390 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1\": container with ID starting with 
1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1 not found: ID does not exist" containerID="1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.795420 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1"} err="failed to get container status \"1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1\": rpc error: code = NotFound desc = could not find container \"1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1\": container with ID starting with 1a049bb24c4630036decd01b6351b0776284915d5be33a0a9a40bad8cbc013c1 not found: ID does not exist" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.795433 4735 scope.go:117] "RemoveContainer" containerID="a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753" Mar 17 01:14:39 crc kubenswrapper[4735]: E0317 01:14:39.795849 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753\": container with ID starting with a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753 not found: ID does not exist" containerID="a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753" Mar 17 01:14:39 crc kubenswrapper[4735]: I0317 01:14:39.795916 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753"} err="failed to get container status \"a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753\": rpc error: code = NotFound desc = could not find container \"a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753\": container with ID starting with a0d70a03c76bbbee4a432f0e4a4d5c2533f51c9f740c92526436672e92c5e753 not found: ID does not 
exist" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.133103 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.137613 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289251 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-client-ca\") pod \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289350 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efb4d4b3-3446-47ec-b3e7-2ad34071d332-serving-cert\") pod \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2f567f-6b35-4399-9f4f-63ab9640173c-serving-cert\") pod \"ad2f567f-6b35-4399-9f4f-63ab9640173c\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289449 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7xmg\" (UniqueName: \"kubernetes.io/projected/ad2f567f-6b35-4399-9f4f-63ab9640173c-kube-api-access-k7xmg\") pod \"ad2f567f-6b35-4399-9f4f-63ab9640173c\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289488 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-config\") pod \"ad2f567f-6b35-4399-9f4f-63ab9640173c\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-config\") pod \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289588 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqn7c\" (UniqueName: \"kubernetes.io/projected/efb4d4b3-3446-47ec-b3e7-2ad34071d332-kube-api-access-cqn7c\") pod \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289639 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-client-ca\") pod \"ad2f567f-6b35-4399-9f4f-63ab9640173c\" (UID: \"ad2f567f-6b35-4399-9f4f-63ab9640173c\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.289671 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-proxy-ca-bundles\") pod \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\" (UID: \"efb4d4b3-3446-47ec-b3e7-2ad34071d332\") " Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.291276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-config" (OuterVolumeSpecName: "config") pod "efb4d4b3-3446-47ec-b3e7-2ad34071d332" (UID: "efb4d4b3-3446-47ec-b3e7-2ad34071d332"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.291578 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-client-ca" (OuterVolumeSpecName: "client-ca") pod "ad2f567f-6b35-4399-9f4f-63ab9640173c" (UID: "ad2f567f-6b35-4399-9f4f-63ab9640173c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.291610 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "efb4d4b3-3446-47ec-b3e7-2ad34071d332" (UID: "efb4d4b3-3446-47ec-b3e7-2ad34071d332"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.291611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-client-ca" (OuterVolumeSpecName: "client-ca") pod "efb4d4b3-3446-47ec-b3e7-2ad34071d332" (UID: "efb4d4b3-3446-47ec-b3e7-2ad34071d332"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.291641 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-config" (OuterVolumeSpecName: "config") pod "ad2f567f-6b35-4399-9f4f-63ab9640173c" (UID: "ad2f567f-6b35-4399-9f4f-63ab9640173c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.294405 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb4d4b3-3446-47ec-b3e7-2ad34071d332-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "efb4d4b3-3446-47ec-b3e7-2ad34071d332" (UID: "efb4d4b3-3446-47ec-b3e7-2ad34071d332"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.294432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb4d4b3-3446-47ec-b3e7-2ad34071d332-kube-api-access-cqn7c" (OuterVolumeSpecName: "kube-api-access-cqn7c") pod "efb4d4b3-3446-47ec-b3e7-2ad34071d332" (UID: "efb4d4b3-3446-47ec-b3e7-2ad34071d332"). InnerVolumeSpecName "kube-api-access-cqn7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.294474 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2f567f-6b35-4399-9f4f-63ab9640173c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ad2f567f-6b35-4399-9f4f-63ab9640173c" (UID: "ad2f567f-6b35-4399-9f4f-63ab9640173c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.297058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2f567f-6b35-4399-9f4f-63ab9640173c-kube-api-access-k7xmg" (OuterVolumeSpecName: "kube-api-access-k7xmg") pod "ad2f567f-6b35-4399-9f4f-63ab9640173c" (UID: "ad2f567f-6b35-4399-9f4f-63ab9640173c"). InnerVolumeSpecName "kube-api-access-k7xmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390631 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390658 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqn7c\" (UniqueName: \"kubernetes.io/projected/efb4d4b3-3446-47ec-b3e7-2ad34071d332-kube-api-access-cqn7c\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390669 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390679 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390689 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efb4d4b3-3446-47ec-b3e7-2ad34071d332-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390698 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efb4d4b3-3446-47ec-b3e7-2ad34071d332-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390707 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2f567f-6b35-4399-9f4f-63ab9640173c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390716 4735 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-k7xmg\" (UniqueName: \"kubernetes.io/projected/ad2f567f-6b35-4399-9f4f-63ab9640173c-kube-api-access-k7xmg\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.390726 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2f567f-6b35-4399-9f4f-63ab9640173c-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.735561 4735 generic.go:334] "Generic (PLEG): container finished" podID="ad2f567f-6b35-4399-9f4f-63ab9640173c" containerID="3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f" exitCode=0 Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.735671 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.735680 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" event={"ID":"ad2f567f-6b35-4399-9f4f-63ab9640173c","Type":"ContainerDied","Data":"3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f"} Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.736010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq" event={"ID":"ad2f567f-6b35-4399-9f4f-63ab9640173c","Type":"ContainerDied","Data":"bec8771ee7c288d2b5ed67a67489ee9c95880452f11edc158e7492648a9e0ad0"} Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.736081 4735 scope.go:117] "RemoveContainer" containerID="3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.739658 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" 
event={"ID":"efb4d4b3-3446-47ec-b3e7-2ad34071d332","Type":"ContainerDied","Data":"d526878a3daa4af26e7f1adcd9e270478984bc88f7ad665301813d74d7e50af2"} Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.739783 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dcdbcd69-87x7s" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.766819 4735 scope.go:117] "RemoveContainer" containerID="3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.767488 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f\": container with ID starting with 3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f not found: ID does not exist" containerID="3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.767580 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f"} err="failed to get container status \"3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f\": rpc error: code = NotFound desc = could not find container \"3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f\": container with ID starting with 3dd57ab64eecb965c50db52484ffe117b68944b86eebddd9c7e988fe9c3d8b3f not found: ID does not exist" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.767651 4735 scope.go:117] "RemoveContainer" containerID="194a4fbb87cb033eb0f6fa81e1dd98eff04de21a0d5ab46d316ae8e5cd028aeb" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.795539 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"] Mar 17 01:14:40 
crc kubenswrapper[4735]: I0317 01:14:40.801688 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54d64694d7-65tmq"] Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.812241 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"] Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.815422 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64dcdbcd69-87x7s"] Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.822645 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tdwt"] Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.823028 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tdwt" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="registry-server" containerID="cri-o://18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37" gracePeriod=2 Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.977804 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf"] Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978410 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978432 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978450 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978458 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978473 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="extract-content" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978483 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="extract-content" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978493 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="extract-utilities" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978502 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="extract-utilities" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978514 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="extract-utilities" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978522 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="extract-utilities" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978535 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb4d4b3-3446-47ec-b3e7-2ad34071d332" containerName="controller-manager" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978543 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb4d4b3-3446-47ec-b3e7-2ad34071d332" containerName="controller-manager" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978555 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="extract-content" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978564 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="extract-content" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978576 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2f567f-6b35-4399-9f4f-63ab9640173c" containerName="route-controller-manager" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978584 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2f567f-6b35-4399-9f4f-63ab9640173c" containerName="route-controller-manager" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978594 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978601 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978614 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="extract-content" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978622 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="extract-content" Mar 17 01:14:40 crc kubenswrapper[4735]: E0317 01:14:40.978632 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="extract-utilities" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978640 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="extract-utilities" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978755 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d63aa4e-027e-4486-8b30-b7b583c57b3a" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978769 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9766c85a-73b3-42ba-90c1-c4c93493d138" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978783 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" containerName="registry-server" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978794 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2f567f-6b35-4399-9f4f-63ab9640173c" containerName="route-controller-manager" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.978806 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb4d4b3-3446-47ec-b3e7-2ad34071d332" containerName="controller-manager" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.979286 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.981373 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw"] Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.981923 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.983181 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.984250 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.984406 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.984542 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.985087 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.986345 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.986599 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.986990 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.987108 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.988200 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.988369 
4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.989072 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.990504 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf"] Mar 17 01:14:40 crc kubenswrapper[4735]: I0317 01:14:40.993071 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.022847 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw"] Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.088030 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6" path="/var/lib/kubelet/pods/00fa0ed7-e7b2-41d6-9583-831b8bb7d6e6/volumes" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.088814 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2f567f-6b35-4399-9f4f-63ab9640173c" path="/var/lib/kubelet/pods/ad2f567f-6b35-4399-9f4f-63ab9640173c/volumes" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.089297 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb4d4b3-3446-47ec-b3e7-2ad34071d332" path="/var/lib/kubelet/pods/efb4d4b3-3446-47ec-b3e7-2ad34071d332/volumes" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkqk\" (UniqueName: \"kubernetes.io/projected/0bf5c913-930c-4718-aed3-f8dde75f3459-kube-api-access-cfkqk\") pod 
\"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099412 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-proxy-ca-bundles\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf5c913-930c-4718-aed3-f8dde75f3459-serving-cert\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099497 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-client-ca\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099522 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2cd7eb4-0063-47b3-bbde-8e30473851f2-serving-cert\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099546 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh85x\" (UniqueName: \"kubernetes.io/projected/c2cd7eb4-0063-47b3-bbde-8e30473851f2-kube-api-access-mh85x\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bf5c913-930c-4718-aed3-f8dde75f3459-client-ca\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099597 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf5c913-930c-4718-aed3-f8dde75f3459-config\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.099622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-config\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-client-ca\") pod \"controller-manager-7d465c5f6f-wsgwf\" 
(UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2cd7eb4-0063-47b3-bbde-8e30473851f2-serving-cert\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh85x\" (UniqueName: \"kubernetes.io/projected/c2cd7eb4-0063-47b3-bbde-8e30473851f2-kube-api-access-mh85x\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201270 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bf5c913-930c-4718-aed3-f8dde75f3459-client-ca\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201286 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf5c913-930c-4718-aed3-f8dde75f3459-config\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201307 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-config\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201621 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkqk\" (UniqueName: \"kubernetes.io/projected/0bf5c913-930c-4718-aed3-f8dde75f3459-kube-api-access-cfkqk\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201679 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-proxy-ca-bundles\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.201773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf5c913-930c-4718-aed3-f8dde75f3459-serving-cert\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.202990 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-config\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 
01:14:41.203478 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-client-ca\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.203746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bf5c913-930c-4718-aed3-f8dde75f3459-client-ca\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.203772 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2cd7eb4-0063-47b3-bbde-8e30473851f2-proxy-ca-bundles\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.204452 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf5c913-930c-4718-aed3-f8dde75f3459-config\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.212365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2cd7eb4-0063-47b3-bbde-8e30473851f2-serving-cert\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " 
pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.216373 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf5c913-930c-4718-aed3-f8dde75f3459-serving-cert\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.220844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh85x\" (UniqueName: \"kubernetes.io/projected/c2cd7eb4-0063-47b3-bbde-8e30473851f2-kube-api-access-mh85x\") pod \"controller-manager-7d465c5f6f-wsgwf\" (UID: \"c2cd7eb4-0063-47b3-bbde-8e30473851f2\") " pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.227357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkqk\" (UniqueName: \"kubernetes.io/projected/0bf5c913-930c-4718-aed3-f8dde75f3459-kube-api-access-cfkqk\") pod \"route-controller-manager-f4dd5b74c-q7hqw\" (UID: \"0bf5c913-930c-4718-aed3-f8dde75f3459\") " pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.282392 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.301158 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.311279 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.403730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-utilities\") pod \"9fb14281-df17-4753-ba53-e292dcb071fa\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.403831 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-catalog-content\") pod \"9fb14281-df17-4753-ba53-e292dcb071fa\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.404097 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhk75\" (UniqueName: \"kubernetes.io/projected/9fb14281-df17-4753-ba53-e292dcb071fa-kube-api-access-nhk75\") pod \"9fb14281-df17-4753-ba53-e292dcb071fa\" (UID: \"9fb14281-df17-4753-ba53-e292dcb071fa\") " Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.405179 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-utilities" (OuterVolumeSpecName: "utilities") pod "9fb14281-df17-4753-ba53-e292dcb071fa" (UID: "9fb14281-df17-4753-ba53-e292dcb071fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.411152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb14281-df17-4753-ba53-e292dcb071fa-kube-api-access-nhk75" (OuterVolumeSpecName: "kube-api-access-nhk75") pod "9fb14281-df17-4753-ba53-e292dcb071fa" (UID: "9fb14281-df17-4753-ba53-e292dcb071fa"). 
InnerVolumeSpecName "kube-api-access-nhk75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.507536 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhk75\" (UniqueName: \"kubernetes.io/projected/9fb14281-df17-4753-ba53-e292dcb071fa-kube-api-access-nhk75\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.507775 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.562139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fb14281-df17-4753-ba53-e292dcb071fa" (UID: "9fb14281-df17-4753-ba53-e292dcb071fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.608699 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb14281-df17-4753-ba53-e292dcb071fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.754501 4735 generic.go:334] "Generic (PLEG): container finished" podID="9fb14281-df17-4753-ba53-e292dcb071fa" containerID="18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37" exitCode=0 Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.754559 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tdwt" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.754608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerDied","Data":"18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37"} Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.754651 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tdwt" event={"ID":"9fb14281-df17-4753-ba53-e292dcb071fa","Type":"ContainerDied","Data":"16564cd19dc9fea5dd6eba7f0246720e7f76946946ba876d74348dfd05b5acb5"} Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.754675 4735 scope.go:117] "RemoveContainer" containerID="18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.771005 4735 scope.go:117] "RemoveContainer" containerID="6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.786558 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tdwt"] Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.790067 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tdwt"] Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.809841 4735 scope.go:117] "RemoveContainer" containerID="f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.849245 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw"] Mar 17 01:14:41 crc kubenswrapper[4735]: W0317 01:14:41.858354 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf5c913_930c_4718_aed3_f8dde75f3459.slice/crio-a6088651ebe5f4eb0044fb35a054afa60e5e21bd6e9da281c3899057b992be0a WatchSource:0}: Error finding container a6088651ebe5f4eb0044fb35a054afa60e5e21bd6e9da281c3899057b992be0a: Status 404 returned error can't find the container with id a6088651ebe5f4eb0044fb35a054afa60e5e21bd6e9da281c3899057b992be0a Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.862293 4735 scope.go:117] "RemoveContainer" containerID="18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37" Mar 17 01:14:41 crc kubenswrapper[4735]: E0317 01:14:41.862824 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37\": container with ID starting with 18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37 not found: ID does not exist" containerID="18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.862899 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37"} err="failed to get container status \"18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37\": rpc error: code = NotFound desc = could not find container \"18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37\": container with ID starting with 18b6686a540845f89361d1e0540f3379eaae0fa1b9a1bf9062948eac97074c37 not found: ID does not exist" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.862938 4735 scope.go:117] "RemoveContainer" containerID="6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605" Mar 17 01:14:41 crc kubenswrapper[4735]: E0317 01:14:41.863405 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605\": container with ID starting with 6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605 not found: ID does not exist" containerID="6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.863445 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605"} err="failed to get container status \"6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605\": rpc error: code = NotFound desc = could not find container \"6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605\": container with ID starting with 6b7f2bb4f5b86a827789af5dc1cd54f92b64fe09a4a6dc389268ebea634f9605 not found: ID does not exist" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.863474 4735 scope.go:117] "RemoveContainer" containerID="f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6" Mar 17 01:14:41 crc kubenswrapper[4735]: E0317 01:14:41.864239 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6\": container with ID starting with f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6 not found: ID does not exist" containerID="f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.864259 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6"} err="failed to get container status \"f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6\": rpc error: code = NotFound desc = could not find container \"f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6\": 
container with ID starting with f8d2b075816d9fe5bf80d3ad3be8add61051254526bd03007dc9883e0a8c31b6 not found: ID does not exist" Mar 17 01:14:41 crc kubenswrapper[4735]: I0317 01:14:41.899614 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf"] Mar 17 01:14:41 crc kubenswrapper[4735]: W0317 01:14:41.908208 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2cd7eb4_0063_47b3_bbde_8e30473851f2.slice/crio-95184ba89a34d264ee3e77a2519030a5c54498126109812ffc51ba6ae7d699ba WatchSource:0}: Error finding container 95184ba89a34d264ee3e77a2519030a5c54498126109812ffc51ba6ae7d699ba: Status 404 returned error can't find the container with id 95184ba89a34d264ee3e77a2519030a5c54498126109812ffc51ba6ae7d699ba Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.767406 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" event={"ID":"0bf5c913-930c-4718-aed3-f8dde75f3459","Type":"ContainerStarted","Data":"d949c8321495c5b4396a772d9001da69e7e4f8368d6482bd6819574a958babec"} Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.767698 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" event={"ID":"0bf5c913-930c-4718-aed3-f8dde75f3459","Type":"ContainerStarted","Data":"a6088651ebe5f4eb0044fb35a054afa60e5e21bd6e9da281c3899057b992be0a"} Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.767719 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.784631 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" Mar 17 
01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.784684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" event={"ID":"c2cd7eb4-0063-47b3-bbde-8e30473851f2","Type":"ContainerStarted","Data":"50a0c7c0fe741671321b6ee0b09dc76638c09542d17ef9f3f8e09ee6926604f2"} Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.784711 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" event={"ID":"c2cd7eb4-0063-47b3-bbde-8e30473851f2","Type":"ContainerStarted","Data":"95184ba89a34d264ee3e77a2519030a5c54498126109812ffc51ba6ae7d699ba"} Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.784882 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.792426 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f4dd5b74c-q7hqw" podStartSLOduration=3.792411281 podStartE2EDuration="3.792411281s" podCreationTimestamp="2026-03-17 01:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:42.789350504 +0000 UTC m=+308.421583472" watchObservedRunningTime="2026-03-17 01:14:42.792411281 +0000 UTC m=+308.424644259" Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.797007 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" Mar 17 01:14:42 crc kubenswrapper[4735]: I0317 01:14:42.821942 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d465c5f6f-wsgwf" podStartSLOduration=3.821926272 podStartE2EDuration="3.821926272s" podCreationTimestamp="2026-03-17 
01:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:14:42.817686964 +0000 UTC m=+308.449919952" watchObservedRunningTime="2026-03-17 01:14:42.821926272 +0000 UTC m=+308.454159250" Mar 17 01:14:43 crc kubenswrapper[4735]: I0317 01:14:43.079136 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" path="/var/lib/kubelet/pods/9fb14281-df17-4753-ba53-e292dcb071fa/volumes" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.729065 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.732038 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="registry-server" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.732102 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="registry-server" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.732145 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="extract-utilities" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.732160 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="extract-utilities" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.732189 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="extract-content" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.732205 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="extract-content" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.733411 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb14281-df17-4753-ba53-e292dcb071fa" containerName="registry-server" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.734272 4735 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.735198 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554" gracePeriod=15 Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.735690 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.735849 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a" gracePeriod=15 Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.736578 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb" gracePeriod=15 Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.736693 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b" 
gracePeriod=15 Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.736782 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79" gracePeriod=15 Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.753969 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.755386 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.757900 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.757932 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.757947 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.757968 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.757979 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.757996 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 01:14:44 
crc kubenswrapper[4735]: I0317 01:14:44.758060 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.758110 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758156 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.758182 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758195 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.758212 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758225 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.758267 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758280 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.758295 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758309 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758574 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758594 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758609 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758630 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758646 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758661 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758677 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 01:14:44 crc kubenswrapper[4735]: E0317 01:14:44.758896 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.758931 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.759108 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.759129 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.807770 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.856462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.856927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.857478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.857619 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.857767 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.857914 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.858041 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.858170 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960139 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960326 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960594 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960933 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.960948 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.961024 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.961801 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.961963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.962018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.962066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:44 crc kubenswrapper[4735]: I0317 01:14:44.962165 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.079185 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.079931 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.107819 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:14:45 crc kubenswrapper[4735]: W0317 01:14:45.142585 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4b03b1650db7ca6bf973bce4a820c522e1b443b9bf91ca66b6787a0859e5e83d WatchSource:0}: Error finding container 4b03b1650db7ca6bf973bce4a820c522e1b443b9bf91ca66b6787a0859e5e83d: Status 404 returned error can't find the container with id 4b03b1650db7ca6bf973bce4a820c522e1b443b9bf91ca66b6787a0859e5e83d Mar 17 01:14:45 crc kubenswrapper[4735]: E0317 01:14:45.147841 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d7becc7b8855c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:14:45.147100508 +0000 UTC m=+310.779333526,LastTimestamp:2026-03-17 01:14:45.147100508 +0000 UTC m=+310.779333526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.821360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"645cc4a7987db8ab1e24af04b4cf5a310941ae5029ab3defb7e91ad369a46cd0"} Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.821412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4b03b1650db7ca6bf973bce4a820c522e1b443b9bf91ca66b6787a0859e5e83d"} Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.821967 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.823522 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.824604 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.825305 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a" exitCode=0 Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.825327 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b" exitCode=0 Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 
01:14:45.825335 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79" exitCode=0 Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.825342 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb" exitCode=2 Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.825410 4735 scope.go:117] "RemoveContainer" containerID="4424938c4f91274dc597cd7fa32ef6c336280946145c70e0d18057bb9a76de59" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.827145 4735 generic.go:334] "Generic (PLEG): container finished" podID="5c97756b-7a02-446c-b81c-efee6a00f860" containerID="14ef6ad67bcc23c39baeb4c07257091f62b41367975b819ec68b83ee69afe061" exitCode=0 Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.827196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c97756b-7a02-446c-b81c-efee6a00f860","Type":"ContainerDied","Data":"14ef6ad67bcc23c39baeb4c07257091f62b41367975b819ec68b83ee69afe061"} Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.828001 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:45 crc kubenswrapper[4735]: I0317 01:14:45.828340 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection 
refused" Mar 17 01:14:46 crc kubenswrapper[4735]: I0317 01:14:46.837603 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.138850 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.139979 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.140648 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.141109 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.141355 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.266778 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.267302 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.267638 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.268009 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.293896 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.293973 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.293980 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.294048 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.294188 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.294171 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.294362 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.294395 4735 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.294415 4735 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.395759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-var-lock\") pod \"5c97756b-7a02-446c-b81c-efee6a00f860\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.395912 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-kubelet-dir\") pod \"5c97756b-7a02-446c-b81c-efee6a00f860\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.395948 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c97756b-7a02-446c-b81c-efee6a00f860-kube-api-access\") pod \"5c97756b-7a02-446c-b81c-efee6a00f860\" (UID: \"5c97756b-7a02-446c-b81c-efee6a00f860\") " Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.396230 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c97756b-7a02-446c-b81c-efee6a00f860" (UID: "5c97756b-7a02-446c-b81c-efee6a00f860"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.396308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-var-lock" (OuterVolumeSpecName: "var-lock") pod "5c97756b-7a02-446c-b81c-efee6a00f860" (UID: "5c97756b-7a02-446c-b81c-efee6a00f860"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.401023 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c97756b-7a02-446c-b81c-efee6a00f860-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c97756b-7a02-446c-b81c-efee6a00f860" (UID: "5c97756b-7a02-446c-b81c-efee6a00f860"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.497962 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.498009 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c97756b-7a02-446c-b81c-efee6a00f860-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.498029 4735 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c97756b-7a02-446c-b81c-efee6a00f860-var-lock\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.846500 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.846419 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c97756b-7a02-446c-b81c-efee6a00f860","Type":"ContainerDied","Data":"4bbe49d54ae5524fd3ec418744f1711050016eef99f87f8423f958e6584eb9f4"} Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.847839 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bbe49d54ae5524fd3ec418744f1711050016eef99f87f8423f958e6584eb9f4" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.851665 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.852402 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554" exitCode=0 Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.852512 4735 scope.go:117] "RemoveContainer" containerID="58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.852679 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.870670 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.871319 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.871846 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.879595 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.880090 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.880343 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.886103 4735 scope.go:117] "RemoveContainer" containerID="d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.908104 4735 scope.go:117] "RemoveContainer" containerID="bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.927154 4735 scope.go:117] "RemoveContainer" containerID="fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.944563 4735 scope.go:117] "RemoveContainer" containerID="48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.966418 4735 scope.go:117] "RemoveContainer" containerID="8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.987285 4735 scope.go:117] "RemoveContainer" containerID="58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a" Mar 17 01:14:47 crc kubenswrapper[4735]: E0317 01:14:47.987694 4735 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\": container with ID starting with 58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a not found: ID does not exist" containerID="58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.987743 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a"} err="failed to get container status \"58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\": rpc error: code = NotFound desc = could not find container \"58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a\": container with ID starting with 58c30d3863c0b9a0175a3d52f729401d721bdf8c1e1ad49fae4e5e1e1595f81a not found: ID does not exist" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.987770 4735 scope.go:117] "RemoveContainer" containerID="d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b" Mar 17 01:14:47 crc kubenswrapper[4735]: E0317 01:14:47.990535 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\": container with ID starting with d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b not found: ID does not exist" containerID="d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.990568 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b"} err="failed to get container status \"d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\": rpc error: code = NotFound 
desc = could not find container \"d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b\": container with ID starting with d19b65e24d1dc3f2a77661a508d3eb089d8244f9d745ee3f7b88fb8102965b8b not found: ID does not exist" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.990620 4735 scope.go:117] "RemoveContainer" containerID="bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79" Mar 17 01:14:47 crc kubenswrapper[4735]: E0317 01:14:47.991035 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\": container with ID starting with bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79 not found: ID does not exist" containerID="bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.991053 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79"} err="failed to get container status \"bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\": rpc error: code = NotFound desc = could not find container \"bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79\": container with ID starting with bd75b12572fbf094830e1915632bf152f2dc68fa0c56b8167b57020e4e97bb79 not found: ID does not exist" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.991067 4735 scope.go:117] "RemoveContainer" containerID="fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb" Mar 17 01:14:47 crc kubenswrapper[4735]: E0317 01:14:47.991330 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\": container with ID starting with 
fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb not found: ID does not exist" containerID="fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.991350 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb"} err="failed to get container status \"fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\": rpc error: code = NotFound desc = could not find container \"fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb\": container with ID starting with fb99180c5fd0b8b85a82bba795475a57951a4c08a1984ec1684b27da4bf77ddb not found: ID does not exist" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.991362 4735 scope.go:117] "RemoveContainer" containerID="48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554" Mar 17 01:14:47 crc kubenswrapper[4735]: E0317 01:14:47.991661 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\": container with ID starting with 48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554 not found: ID does not exist" containerID="48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.991682 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554"} err="failed to get container status \"48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\": rpc error: code = NotFound desc = could not find container \"48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554\": container with ID starting with 48aa63af223c91b49153401d5d53afd4ec764509a8c92dc76317eacc749c7554 not found: ID does not 
exist" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.991699 4735 scope.go:117] "RemoveContainer" containerID="8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2" Mar 17 01:14:47 crc kubenswrapper[4735]: E0317 01:14:47.992180 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\": container with ID starting with 8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2 not found: ID does not exist" containerID="8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2" Mar 17 01:14:47 crc kubenswrapper[4735]: I0317 01:14:47.992246 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2"} err="failed to get container status \"8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\": rpc error: code = NotFound desc = could not find container \"8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2\": container with ID starting with 8efc856e751d9d989737731d6d8644f65905ed5e9a9215645b83dc7830a4b3f2 not found: ID does not exist" Mar 17 01:14:49 crc kubenswrapper[4735]: I0317 01:14:49.083752 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 17 01:14:49 crc kubenswrapper[4735]: E0317 01:14:49.129132 4735 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" 
volumeName="registry-storage" Mar 17 01:14:49 crc kubenswrapper[4735]: E0317 01:14:49.555028 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d7becc7b8855c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 01:14:45.147100508 +0000 UTC m=+310.779333526,LastTimestamp:2026-03-17 01:14:45.147100508 +0000 UTC m=+310.779333526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.000455 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.001318 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.001948 4735 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.002626 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.003501 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:50 crc kubenswrapper[4735]: I0317 01:14:50.003677 4735 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.004368 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.206977 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms" Mar 17 01:14:50 crc kubenswrapper[4735]: E0317 01:14:50.607417 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms" Mar 17 01:14:51 crc kubenswrapper[4735]: E0317 
01:14:51.409037 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Mar 17 01:14:53 crc kubenswrapper[4735]: E0317 01:14:53.010619 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Mar 17 01:14:55 crc kubenswrapper[4735]: I0317 01:14:55.077554 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:55 crc kubenswrapper[4735]: I0317 01:14:55.078631 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.072345 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.073852 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.074491 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.100957 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.101003 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:14:56 crc kubenswrapper[4735]: E0317 01:14:56.101431 4735 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.102139 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:56 crc kubenswrapper[4735]: W0317 01:14:56.134801 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-30da15f88e34908e9fd17597ed1993aa8f7c1c456a75b9511180645215f79c95 WatchSource:0}: Error finding container 30da15f88e34908e9fd17597ed1993aa8f7c1c456a75b9511180645215f79c95: Status 404 returned error can't find the container with id 30da15f88e34908e9fd17597ed1993aa8f7c1c456a75b9511180645215f79c95 Mar 17 01:14:56 crc kubenswrapper[4735]: E0317 01:14:56.212277 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="6.4s" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.930000 4735 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d43793f84179fce75f6646f0647522bc434898074ab281b1785f0502af0e7b5c" exitCode=0 Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.930064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d43793f84179fce75f6646f0647522bc434898074ab281b1785f0502af0e7b5c"} Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.930111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"30da15f88e34908e9fd17597ed1993aa8f7c1c456a75b9511180645215f79c95"} Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.930667 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.930699 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.931388 4735 status_manager.go:851] "Failed to get status for pod" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:56 crc kubenswrapper[4735]: E0317 01:14:56.931416 4735 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:56 crc kubenswrapper[4735]: I0317 01:14:56.931806 4735 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Mar 17 01:14:57 crc kubenswrapper[4735]: I0317 01:14:57.937056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d16d1a51227f093d998f8771229c8bd831fbac7ad77bc8393392f1eb02fa72cc"} Mar 17 01:14:57 crc kubenswrapper[4735]: I0317 01:14:57.937324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ce80b086d670e95e1fe466c5b88d9fb43869ae99e6d67898697fcb9af2d6dd27"} Mar 17 01:14:57 crc kubenswrapper[4735]: I0317 01:14:57.937336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"16d8a085711d5d8a1ae4baae18ea6e2e5b03d6168af19eb4ecddb65f115b004f"} Mar 17 01:14:57 crc kubenswrapper[4735]: I0317 01:14:57.937345 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d0993359a8fcf258a5f79540fb6efbf4bd17d4a0fe112e06725feec2284d70e"} Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.968754 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e5edfb63e8d31a4fc74ced48e15f09ebcfcb01d59a8c9d5714499c2b6c7a6abd"} Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.969175 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.969200 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.969498 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.975133 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 
01:14:58.975744 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.975803 4735 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a" exitCode=1 Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.975838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a"} Mar 17 01:14:58 crc kubenswrapper[4735]: I0317 01:14:58.976399 4735 scope.go:117] "RemoveContainer" containerID="0750da82f3cca98968b1ba678a86116622c9dbc27f14d17aa556abfa83e9f99a" Mar 17 01:14:59 crc kubenswrapper[4735]: I0317 01:14:59.022776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:14:59 crc kubenswrapper[4735]: I0317 01:14:59.984744 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 01:14:59 crc kubenswrapper[4735]: I0317 01:14:59.986282 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 01:14:59 crc kubenswrapper[4735]: I0317 01:14:59.986340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba8a888235849591fff2b99f6e531fc6197b7be264b0a2f45140b631a1aaebcd"} Mar 17 01:15:01 crc kubenswrapper[4735]: I0317 01:15:01.103230 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:15:01 crc kubenswrapper[4735]: I0317 01:15:01.103289 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:15:01 crc kubenswrapper[4735]: I0317 01:15:01.111329 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:15:03 crc kubenswrapper[4735]: I0317 01:15:03.981064 4735 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:15:04 crc kubenswrapper[4735]: I0317 01:15:04.014318 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:15:04 crc kubenswrapper[4735]: I0317 01:15:04.014347 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:15:04 crc kubenswrapper[4735]: I0317 01:15:04.017934 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:15:05 crc kubenswrapper[4735]: I0317 01:15:05.024141 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:15:05 crc kubenswrapper[4735]: I0317 01:15:05.024426 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a4ffb6dd-b8c0-42b3-819e-12ca3e7ea1c5" Mar 17 01:15:05 crc kubenswrapper[4735]: I0317 01:15:05.094534 4735 status_manager.go:861] "Pod was 
deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="74878eeb-74d5-4767-adec-c0176143bca2" Mar 17 01:15:06 crc kubenswrapper[4735]: I0317 01:15:06.596737 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:15:06 crc kubenswrapper[4735]: I0317 01:15:06.604479 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:15:07 crc kubenswrapper[4735]: I0317 01:15:07.042945 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:15:09 crc kubenswrapper[4735]: I0317 01:15:09.029338 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 01:15:12 crc kubenswrapper[4735]: I0317 01:15:12.380748 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 17 01:15:13 crc kubenswrapper[4735]: I0317 01:15:13.120257 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 17 01:15:14 crc kubenswrapper[4735]: I0317 01:15:14.572079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 17 01:15:14 crc kubenswrapper[4735]: I0317 01:15:14.946126 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 17 01:15:15 crc kubenswrapper[4735]: I0317 01:15:15.347691 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 01:15:15 crc kubenswrapper[4735]: I0317 
01:15:15.379734 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 17 01:15:15 crc kubenswrapper[4735]: I0317 01:15:15.492192 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.045029 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.082210 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.271419 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.359144 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.426951 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.557215 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.584033 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.638567 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.703218 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 17 01:15:16 crc kubenswrapper[4735]: 
I0317 01:15:16.709222 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 17 01:15:16 crc kubenswrapper[4735]: I0317 01:15:16.998935 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 17 01:15:17 crc kubenswrapper[4735]: I0317 01:15:17.107888 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 17 01:15:17 crc kubenswrapper[4735]: I0317 01:15:17.255056 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 17 01:15:17 crc kubenswrapper[4735]: I0317 01:15:17.270297 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 17 01:15:17 crc kubenswrapper[4735]: I0317 01:15:17.371959 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 17 01:15:17 crc kubenswrapper[4735]: I0317 01:15:17.462018 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.308977 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.361265 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.543338 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.742031 4735 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.747793 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.749486 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.819442 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.823386 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.835024 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.869063 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 17 01:15:18 crc kubenswrapper[4735]: I0317 01:15:18.938109 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.068943 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.079770 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.107595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 
01:15:19.108287 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.112975 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.175347 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.271024 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.296639 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.376425 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.432991 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.529717 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.659718 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.776702 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.861339 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 17 01:15:19 crc kubenswrapper[4735]: I0317 01:15:19.862418 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.041188 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.063769 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.077081 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.078164 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.117962 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.148013 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.217016 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.223029 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.260642 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 17 01:15:20 crc 
kubenswrapper[4735]: I0317 01:15:20.305231 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.361612 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.384038 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.387175 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.624596 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.655099 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.670764 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.720179 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.721884 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.753297 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 17 01:15:20 crc kubenswrapper[4735]: I0317 01:15:20.944419 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.039426 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.100156 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.117935 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.181202 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.193123 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.239517 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.242595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.346350 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.391511 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.490356 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.528137 4735 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.577770 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.582266 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.621151 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.622926 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.653507 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.698514 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.755357 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 17 01:15:21 crc kubenswrapper[4735]: I0317 01:15:21.881781 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.182976 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.229263 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 17 01:15:22 crc 
kubenswrapper[4735]: I0317 01:15:22.289977 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.435218 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.476725 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.502074 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.536603 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.551540 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.658194 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.738062 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.749399 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.882149 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.884510 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 17 
01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.985891 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 17 01:15:22 crc kubenswrapper[4735]: I0317 01:15:22.986798 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.032632 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.056625 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.082302 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.104482 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.106747 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.137414 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.151171 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.159314 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.172283 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.206019 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.214022 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.257621 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.329081 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.396837 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.417025 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.490630 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.508062 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.565716 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.573378 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 17 01:15:23 crc 
kubenswrapper[4735]: I0317 01:15:23.775763 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.817436 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.838181 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.868047 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.879112 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.892197 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.910038 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.936430 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.938306 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 17 01:15:23 crc kubenswrapper[4735]: I0317 01:15:23.986524 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.082984 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 17 01:15:24 
crc kubenswrapper[4735]: I0317 01:15:24.322620 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.360410 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.441409 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.611386 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.679479 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.719868 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.723274 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.760337 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.820093 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 17 01:15:24 crc kubenswrapper[4735]: I0317 01:15:24.971939 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.006028 4735 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.103288 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.112460 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.139541 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.139651 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.158788 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.179474 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.182590 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.203023 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.271565 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.306326 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 17 01:15:25 crc 
kubenswrapper[4735]: I0317 01:15:25.317964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.323927 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.364973 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.453368 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.487006 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.553569 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.622206 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.633358 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.678749 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.757279 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.765024 4735 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 17 01:15:25 crc kubenswrapper[4735]: I0317 01:15:25.776234 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.020194 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.027428 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.035169 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.099385 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.123934 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.173342 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.180703 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.252710 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.446280 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.472826 4735 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.549685 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.554111 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.625820 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.639981 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.700040 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.810508 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.822875 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.885748 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.936282 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 17 01:15:26 crc kubenswrapper[4735]: I0317 01:15:26.988351 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.025130 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.097749 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.106898 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.160000 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.285517 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.323135 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.366299 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.390627 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.396837 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.462283 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.518057 4735 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.617173 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.623029 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.649245 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.675880 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.681668 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.819208 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.838819 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.846624 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.904757 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.932772 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 17 01:15:27 crc 
kubenswrapper[4735]: I0317 01:15:27.949712 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 17 01:15:27 crc kubenswrapper[4735]: I0317 01:15:27.958961 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.020201 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.221623 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.513207 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.549750 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.607598 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.624467 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.657028 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.753618 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.767542 4735 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.771043 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.900328 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.924996 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.944699 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 17 01:15:28 crc kubenswrapper[4735]: I0317 01:15:28.988063 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 17 01:15:29 crc kubenswrapper[4735]: I0317 01:15:29.007813 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 01:15:29 crc kubenswrapper[4735]: I0317 01:15:29.564710 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 01:15:29 crc kubenswrapper[4735]: I0317 01:15:29.735275 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 17 01:15:29 crc kubenswrapper[4735]: I0317 01:15:29.821099 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 01:15:29 crc kubenswrapper[4735]: I0317 01:15:29.888642 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 17 01:15:29 crc kubenswrapper[4735]: I0317 
01:15:29.909073 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.148535 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.148905 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=46.148871484 podStartE2EDuration="46.148871484s" podCreationTimestamp="2026-03-17 01:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:15:04.018473015 +0000 UTC m=+329.650705993" watchObservedRunningTime="2026-03-17 01:15:30.148871484 +0000 UTC m=+355.781104472" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.154494 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.154557 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.159769 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.184623 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=27.184596864 podStartE2EDuration="27.184596864s" podCreationTimestamp="2026-03-17 01:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:15:30.18015184 +0000 UTC m=+355.812384828" watchObservedRunningTime="2026-03-17 01:15:30.184596864 +0000 UTC m=+355.816829882" Mar 17 01:15:30 
crc kubenswrapper[4735]: I0317 01:15:30.185693 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.199538 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.279347 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.304012 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.315631 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.379288 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.767781 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl"] Mar 17 01:15:30 crc kubenswrapper[4735]: E0317 01:15:30.768035 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" containerName="installer" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.768047 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" containerName="installer" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.768151 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c97756b-7a02-446c-b81c-efee6a00f860" containerName="installer" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.768515 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.770837 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.771186 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.774392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl"] Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.796663 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.841489 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.915764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5dt\" (UniqueName: \"kubernetes.io/projected/aee5ca9d-9692-4474-81a7-226fd9192272-kube-api-access-zk5dt\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.915827 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee5ca9d-9692-4474-81a7-226fd9192272-config-volume\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" 
Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.915868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee5ca9d-9692-4474-81a7-226fd9192272-secret-volume\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:30 crc kubenswrapper[4735]: I0317 01:15:30.941823 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.016634 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5dt\" (UniqueName: \"kubernetes.io/projected/aee5ca9d-9692-4474-81a7-226fd9192272-kube-api-access-zk5dt\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.016702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee5ca9d-9692-4474-81a7-226fd9192272-config-volume\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.016740 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee5ca9d-9692-4474-81a7-226fd9192272-secret-volume\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.017553 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee5ca9d-9692-4474-81a7-226fd9192272-config-volume\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.030787 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5dt\" (UniqueName: \"kubernetes.io/projected/aee5ca9d-9692-4474-81a7-226fd9192272-kube-api-access-zk5dt\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.031212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee5ca9d-9692-4474-81a7-226fd9192272-secret-volume\") pod \"collect-profiles-29561835-kwwfl\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.084645 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.127322 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.143728 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.213092 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.340041 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.465933 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.497841 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl"] Mar 17 01:15:31 crc kubenswrapper[4735]: I0317 01:15:31.733949 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.002468 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.115159 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.229058 4735 generic.go:334] "Generic (PLEG): container finished" podID="aee5ca9d-9692-4474-81a7-226fd9192272" containerID="a6cdb0e4ab56afba5560d96b4748f241772752f8bd94631191773f1e31aa3838" exitCode=0 Mar 17 01:15:32 
crc kubenswrapper[4735]: I0317 01:15:32.229103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" event={"ID":"aee5ca9d-9692-4474-81a7-226fd9192272","Type":"ContainerDied","Data":"a6cdb0e4ab56afba5560d96b4748f241772752f8bd94631191773f1e31aa3838"} Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.229146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" event={"ID":"aee5ca9d-9692-4474-81a7-226fd9192272","Type":"ContainerStarted","Data":"f78eef9e931e2be1b499d516f1d6031e5f074a4fd508afd25f27a27badaddc50"} Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.373483 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.448025 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.556983 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.641375 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 17 01:15:32 crc kubenswrapper[4735]: I0317 01:15:32.898421 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.600104 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.745997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee5ca9d-9692-4474-81a7-226fd9192272-secret-volume\") pod \"aee5ca9d-9692-4474-81a7-226fd9192272\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.746122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee5ca9d-9692-4474-81a7-226fd9192272-config-volume\") pod \"aee5ca9d-9692-4474-81a7-226fd9192272\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.746218 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk5dt\" (UniqueName: \"kubernetes.io/projected/aee5ca9d-9692-4474-81a7-226fd9192272-kube-api-access-zk5dt\") pod \"aee5ca9d-9692-4474-81a7-226fd9192272\" (UID: \"aee5ca9d-9692-4474-81a7-226fd9192272\") " Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.747256 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee5ca9d-9692-4474-81a7-226fd9192272-config-volume" (OuterVolumeSpecName: "config-volume") pod "aee5ca9d-9692-4474-81a7-226fd9192272" (UID: "aee5ca9d-9692-4474-81a7-226fd9192272"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.755444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee5ca9d-9692-4474-81a7-226fd9192272-kube-api-access-zk5dt" (OuterVolumeSpecName: "kube-api-access-zk5dt") pod "aee5ca9d-9692-4474-81a7-226fd9192272" (UID: "aee5ca9d-9692-4474-81a7-226fd9192272"). 
InnerVolumeSpecName "kube-api-access-zk5dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.756285 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee5ca9d-9692-4474-81a7-226fd9192272-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aee5ca9d-9692-4474-81a7-226fd9192272" (UID: "aee5ca9d-9692-4474-81a7-226fd9192272"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.847794 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee5ca9d-9692-4474-81a7-226fd9192272-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.847895 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee5ca9d-9692-4474-81a7-226fd9192272-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:33 crc kubenswrapper[4735]: I0317 01:15:33.847923 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk5dt\" (UniqueName: \"kubernetes.io/projected/aee5ca9d-9692-4474-81a7-226fd9192272-kube-api-access-zk5dt\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:34 crc kubenswrapper[4735]: I0317 01:15:34.245473 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" event={"ID":"aee5ca9d-9692-4474-81a7-226fd9192272","Type":"ContainerDied","Data":"f78eef9e931e2be1b499d516f1d6031e5f074a4fd508afd25f27a27badaddc50"} Mar 17 01:15:34 crc kubenswrapper[4735]: I0317 01:15:34.245540 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f78eef9e931e2be1b499d516f1d6031e5f074a4fd508afd25f27a27badaddc50" Mar 17 01:15:34 crc kubenswrapper[4735]: I0317 01:15:34.245620 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl" Mar 17 01:15:37 crc kubenswrapper[4735]: I0317 01:15:37.743024 4735 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 01:15:37 crc kubenswrapper[4735]: I0317 01:15:37.743716 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://645cc4a7987db8ab1e24af04b4cf5a310941ae5029ab3defb7e91ad369a46cd0" gracePeriod=5 Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.302568 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.304326 4735 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="645cc4a7987db8ab1e24af04b4cf5a310941ae5029ab3defb7e91ad369a46cd0" exitCode=137 Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.304496 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b03b1650db7ca6bf973bce4a820c522e1b443b9bf91ca66b6787a0859e5e83d" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.343043 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.343362 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495471 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495514 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495546 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495590 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495614 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495891 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495908 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.495923 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.521090 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.597674 4735 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.597728 4735 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.597747 4735 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.597766 4735 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:43 crc kubenswrapper[4735]: I0317 01:15:43.597784 4735 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:44 crc kubenswrapper[4735]: I0317 01:15:44.309620 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.079291 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.080117 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.089983 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.090028 4735 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="17414da0-2451-409d-bf0c-84546f2729ea" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.094291 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.094334 4735 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="17414da0-2451-409d-bf0c-84546f2729ea" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.424973 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxnnp"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.425867 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mxnnp" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="registry-server" containerID="cri-o://dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5" gracePeriod=30 Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 
01:15:45.438157 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-db76h"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.438630 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-db76h" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="registry-server" containerID="cri-o://fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb" gracePeriod=30 Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.446964 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rpxx"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.447158 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerName="marketplace-operator" containerID="cri-o://05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53" gracePeriod=30 Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.460487 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4h2p"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.460822 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d4h2p" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="registry-server" containerID="cri-o://197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15" gracePeriod=30 Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.473180 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46vft"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.473549 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-46vft" 
podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="registry-server" containerID="cri-o://5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" gracePeriod=30 Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.491026 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qkds"] Mar 17 01:15:45 crc kubenswrapper[4735]: E0317 01:15:45.491227 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.491242 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 01:15:45 crc kubenswrapper[4735]: E0317 01:15:45.491255 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee5ca9d-9692-4474-81a7-226fd9192272" containerName="collect-profiles" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.491267 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee5ca9d-9692-4474-81a7-226fd9192272" containerName="collect-profiles" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.491359 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee5ca9d-9692-4474-81a7-226fd9192272" containerName="collect-profiles" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.491390 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.491735 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.503195 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qkds"] Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.625834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.625903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nxjl\" (UniqueName: \"kubernetes.io/projected/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-kube-api-access-9nxjl\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.625929 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.727304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qkds\" (UID: 
\"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.727350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nxjl\" (UniqueName: \"kubernetes.io/projected/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-kube-api-access-9nxjl\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.727373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.728766 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.733008 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.742235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nxjl\" 
(UniqueName: \"kubernetes.io/projected/f56c71b2-cf3c-4fe1-8c13-fd905c5a623d-kube-api-access-9nxjl\") pod \"marketplace-operator-79b997595-2qkds\" (UID: \"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.772427 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.817653 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.928800 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-catalog-content\") pod \"9f91af40-cf84-4b86-8aa5-fce087ec360d\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.928850 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fdtd\" (UniqueName: \"kubernetes.io/projected/9f91af40-cf84-4b86-8aa5-fce087ec360d-kube-api-access-8fdtd\") pod \"9f91af40-cf84-4b86-8aa5-fce087ec360d\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.928890 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-utilities\") pod \"9f91af40-cf84-4b86-8aa5-fce087ec360d\" (UID: \"9f91af40-cf84-4b86-8aa5-fce087ec360d\") " Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.933690 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-utilities" (OuterVolumeSpecName: "utilities") pod 
"9f91af40-cf84-4b86-8aa5-fce087ec360d" (UID: "9f91af40-cf84-4b86-8aa5-fce087ec360d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.937275 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f91af40-cf84-4b86-8aa5-fce087ec360d-kube-api-access-8fdtd" (OuterVolumeSpecName: "kube-api-access-8fdtd") pod "9f91af40-cf84-4b86-8aa5-fce087ec360d" (UID: "9f91af40-cf84-4b86-8aa5-fce087ec360d"). InnerVolumeSpecName "kube-api-access-8fdtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.975521 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.983195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f91af40-cf84-4b86-8aa5-fce087ec360d" (UID: "9f91af40-cf84-4b86-8aa5-fce087ec360d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:45 crc kubenswrapper[4735]: E0317 01:15:45.984481 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172 is running failed: container process not found" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" cmd=["grpc_health_probe","-addr=:50051"] Mar 17 01:15:45 crc kubenswrapper[4735]: E0317 01:15:45.984813 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172 is running failed: container process not found" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" cmd=["grpc_health_probe","-addr=:50051"] Mar 17 01:15:45 crc kubenswrapper[4735]: E0317 01:15:45.985066 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172 is running failed: container process not found" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" cmd=["grpc_health_probe","-addr=:50051"] Mar 17 01:15:45 crc kubenswrapper[4735]: E0317 01:15:45.985106 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-46vft" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="registry-server" Mar 17 01:15:45 crc kubenswrapper[4735]: I0317 01:15:45.989161 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.009201 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.029970 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.029997 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fdtd\" (UniqueName: \"kubernetes.io/projected/9f91af40-cf84-4b86-8aa5-fce087ec360d-kube-api-access-8fdtd\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.030007 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f91af40-cf84-4b86-8aa5-fce087ec360d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.033168 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130781 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-utilities\") pod \"54e396e2-3911-4d16-9ff4-588b49a8a77c\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130819 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-utilities\") pod \"1a4643c9-428f-46de-b795-44c73e85d7f9\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130847 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwdhh\" (UniqueName: \"kubernetes.io/projected/54e396e2-3911-4d16-9ff4-588b49a8a77c-kube-api-access-hwdhh\") pod \"54e396e2-3911-4d16-9ff4-588b49a8a77c\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-operator-metrics\") pod \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130938 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-catalog-content\") pod \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-7d8fr\" (UniqueName: \"kubernetes.io/projected/277daa3a-bd0c-46b3-915f-1050fbfa37ac-kube-api-access-7d8fr\") pod \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.130993 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-catalog-content\") pod \"1a4643c9-428f-46de-b795-44c73e85d7f9\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131030 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-utilities\") pod \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\" (UID: \"277daa3a-bd0c-46b3-915f-1050fbfa37ac\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131051 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-trusted-ca\") pod \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\" (UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131067 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7cdt\" (UniqueName: \"kubernetes.io/projected/1a4643c9-428f-46de-b795-44c73e85d7f9-kube-api-access-g7cdt\") pod \"1a4643c9-428f-46de-b795-44c73e85d7f9\" (UID: \"1a4643c9-428f-46de-b795-44c73e85d7f9\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131086 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzw5l\" (UniqueName: \"kubernetes.io/projected/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-kube-api-access-wzw5l\") pod \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\" 
(UID: \"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131106 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-catalog-content\") pod \"54e396e2-3911-4d16-9ff4-588b49a8a77c\" (UID: \"54e396e2-3911-4d16-9ff4-588b49a8a77c\") " Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-utilities" (OuterVolumeSpecName: "utilities") pod "1a4643c9-428f-46de-b795-44c73e85d7f9" (UID: "1a4643c9-428f-46de-b795-44c73e85d7f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.131494 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-utilities" (OuterVolumeSpecName: "utilities") pod "54e396e2-3911-4d16-9ff4-588b49a8a77c" (UID: "54e396e2-3911-4d16-9ff4-588b49a8a77c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.132139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" (UID: "2dcdba3a-a61a-4a1d-a3db-19dd90826b4c"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.133053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-utilities" (OuterVolumeSpecName: "utilities") pod "277daa3a-bd0c-46b3-915f-1050fbfa37ac" (UID: "277daa3a-bd0c-46b3-915f-1050fbfa37ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.135557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4643c9-428f-46de-b795-44c73e85d7f9-kube-api-access-g7cdt" (OuterVolumeSpecName: "kube-api-access-g7cdt") pod "1a4643c9-428f-46de-b795-44c73e85d7f9" (UID: "1a4643c9-428f-46de-b795-44c73e85d7f9"). InnerVolumeSpecName "kube-api-access-g7cdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.136053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-kube-api-access-wzw5l" (OuterVolumeSpecName: "kube-api-access-wzw5l") pod "2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" (UID: "2dcdba3a-a61a-4a1d-a3db-19dd90826b4c"). InnerVolumeSpecName "kube-api-access-wzw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.138382 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e396e2-3911-4d16-9ff4-588b49a8a77c-kube-api-access-hwdhh" (OuterVolumeSpecName: "kube-api-access-hwdhh") pod "54e396e2-3911-4d16-9ff4-588b49a8a77c" (UID: "54e396e2-3911-4d16-9ff4-588b49a8a77c"). InnerVolumeSpecName "kube-api-access-hwdhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.138437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" (UID: "2dcdba3a-a61a-4a1d-a3db-19dd90826b4c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.139820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277daa3a-bd0c-46b3-915f-1050fbfa37ac-kube-api-access-7d8fr" (OuterVolumeSpecName: "kube-api-access-7d8fr") pod "277daa3a-bd0c-46b3-915f-1050fbfa37ac" (UID: "277daa3a-bd0c-46b3-915f-1050fbfa37ac"). InnerVolumeSpecName "kube-api-access-7d8fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.160015 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "277daa3a-bd0c-46b3-915f-1050fbfa37ac" (UID: "277daa3a-bd0c-46b3-915f-1050fbfa37ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.195061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a4643c9-428f-46de-b795-44c73e85d7f9" (UID: "1a4643c9-428f-46de-b795-44c73e85d7f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232569 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232603 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232616 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7cdt\" (UniqueName: \"kubernetes.io/projected/1a4643c9-428f-46de-b795-44c73e85d7f9-kube-api-access-g7cdt\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232627 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzw5l\" (UniqueName: \"kubernetes.io/projected/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-kube-api-access-wzw5l\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232636 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232647 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232655 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwdhh\" (UniqueName: \"kubernetes.io/projected/54e396e2-3911-4d16-9ff4-588b49a8a77c-kube-api-access-hwdhh\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc 
kubenswrapper[4735]: I0317 01:15:46.232665 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232673 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277daa3a-bd0c-46b3-915f-1050fbfa37ac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232682 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d8fr\" (UniqueName: \"kubernetes.io/projected/277daa3a-bd0c-46b3-915f-1050fbfa37ac-kube-api-access-7d8fr\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.232690 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a4643c9-428f-46de-b795-44c73e85d7f9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.259816 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54e396e2-3911-4d16-9ff4-588b49a8a77c" (UID: "54e396e2-3911-4d16-9ff4-588b49a8a77c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.269595 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qkds"] Mar 17 01:15:46 crc kubenswrapper[4735]: W0317 01:15:46.274927 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56c71b2_cf3c_4fe1_8c13_fd905c5a623d.slice/crio-249cfe31d505dab94728769aca2047ca0ae481a3c3e9e11b3fef3bda2dbc1a9a WatchSource:0}: Error finding container 249cfe31d505dab94728769aca2047ca0ae481a3c3e9e11b3fef3bda2dbc1a9a: Status 404 returned error can't find the container with id 249cfe31d505dab94728769aca2047ca0ae481a3c3e9e11b3fef3bda2dbc1a9a Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.322341 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46vft" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.322342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerDied","Data":"5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.322450 4735 scope.go:117] "RemoveContainer" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.322228 4735 generic.go:334] "Generic (PLEG): container finished" podID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" exitCode=0 Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.328997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46vft" 
event={"ID":"54e396e2-3911-4d16-9ff4-588b49a8a77c","Type":"ContainerDied","Data":"842f025fb6c818013c1fcbffedfb5458c360fcb92cdd6a705fa061eecd375a8a"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.333976 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e396e2-3911-4d16-9ff4-588b49a8a77c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.335895 4735 scope.go:117] "RemoveContainer" containerID="19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.336128 4735 generic.go:334] "Generic (PLEG): container finished" podID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerID="197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15" exitCode=0 Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.336183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerDied","Data":"197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.336208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4h2p" event={"ID":"277daa3a-bd0c-46b3-915f-1050fbfa37ac","Type":"ContainerDied","Data":"ef91125e4d65a5ed53caa166d91c694e045d2c84a1cffd70ac531a9437490336"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.336269 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4h2p" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.337941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" event={"ID":"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d","Type":"ContainerStarted","Data":"249cfe31d505dab94728769aca2047ca0ae481a3c3e9e11b3fef3bda2dbc1a9a"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.340961 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerID="fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb" exitCode=0 Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.341001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-db76h" event={"ID":"1a4643c9-428f-46de-b795-44c73e85d7f9","Type":"ContainerDied","Data":"fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.341019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-db76h" event={"ID":"1a4643c9-428f-46de-b795-44c73e85d7f9","Type":"ContainerDied","Data":"1057dbe8711cf87900956da3f54d1a037225b396d4ac3380ecdaa2be9366a11a"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.341086 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-db76h" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.347845 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerID="05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53" exitCode=0 Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.347919 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.348000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" event={"ID":"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c","Type":"ContainerDied","Data":"05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.348023 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4rpxx" event={"ID":"2dcdba3a-a61a-4a1d-a3db-19dd90826b4c","Type":"ContainerDied","Data":"f68c47ed97796425f9488ce2ad4fae98b5c5f36b8ce344b4add434ad07238311"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.354709 4735 generic.go:334] "Generic (PLEG): container finished" podID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerID="dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5" exitCode=0 Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.354766 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerDied","Data":"dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.354841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxnnp" event={"ID":"9f91af40-cf84-4b86-8aa5-fce087ec360d","Type":"ContainerDied","Data":"ad46de335b6e558843ac26e8849df144da9ea070c70df84c806e37f48d628498"} Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.354794 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mxnnp" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.359756 4735 scope.go:117] "RemoveContainer" containerID="bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.372900 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4h2p"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.379011 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4h2p"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.387783 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46vft"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.399282 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-46vft"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.402818 4735 scope.go:117] "RemoveContainer" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.403302 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172\": container with ID starting with 5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172 not found: ID does not exist" containerID="5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.403328 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172"} err="failed to get container status \"5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172\": rpc error: code = NotFound desc = could not find container 
\"5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172\": container with ID starting with 5845e76df50a0ae28af875612dcb3777995fe68b88e32b4d1f05a1b3c3c7c172 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.403345 4735 scope.go:117] "RemoveContainer" containerID="19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.403651 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f\": container with ID starting with 19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f not found: ID does not exist" containerID="19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.403685 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f"} err="failed to get container status \"19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f\": rpc error: code = NotFound desc = could not find container \"19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f\": container with ID starting with 19eabd7e47f3d86e3ac3d3452b0fc62f4a25318cb7f7d05e37aaabf49427210f not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.403697 4735 scope.go:117] "RemoveContainer" containerID="bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.404457 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18\": container with ID starting with bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18 not found: ID does not exist" 
containerID="bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.404475 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18"} err="failed to get container status \"bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18\": rpc error: code = NotFound desc = could not find container \"bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18\": container with ID starting with bcdd5402563b04e276aff97277a394d72d1385e06d605543c87954f6e8e4dc18 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.404498 4735 scope.go:117] "RemoveContainer" containerID="197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.422498 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-db76h"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.424608 4735 scope.go:117] "RemoveContainer" containerID="dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.438804 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-db76h"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.448531 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rpxx"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.449555 4735 scope.go:117] "RemoveContainer" containerID="76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.457121 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rpxx"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.465958 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxnnp"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.466692 4735 scope.go:117] "RemoveContainer" containerID="197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.467166 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15\": container with ID starting with 197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15 not found: ID does not exist" containerID="197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.467217 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15"} err="failed to get container status \"197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15\": rpc error: code = NotFound desc = could not find container \"197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15\": container with ID starting with 197bb9ec304bedc1e752de0515b376ca26766915c232bb359208cf3ecdf5ef15 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.467256 4735 scope.go:117] "RemoveContainer" containerID="dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.467737 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1\": container with ID starting with dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1 not found: ID does not exist" containerID="dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1" Mar 17 
01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.467782 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1"} err="failed to get container status \"dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1\": rpc error: code = NotFound desc = could not find container \"dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1\": container with ID starting with dc52a02295d94dccf3deff44a85b8b85d440a5c831d3b1a24f90d84a024bc3c1 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.467796 4735 scope.go:117] "RemoveContainer" containerID="76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.468331 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3\": container with ID starting with 76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3 not found: ID does not exist" containerID="76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.468375 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3"} err="failed to get container status \"76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3\": rpc error: code = NotFound desc = could not find container \"76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3\": container with ID starting with 76b85ab14a517b605a6fe0d2849d918db16a6548a5fb791eb370d959f7735ad3 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.468406 4735 scope.go:117] "RemoveContainer" 
containerID="fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.473291 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mxnnp"] Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.480053 4735 scope.go:117] "RemoveContainer" containerID="1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.498125 4735 scope.go:117] "RemoveContainer" containerID="ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.525006 4735 scope.go:117] "RemoveContainer" containerID="fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.525307 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb\": container with ID starting with fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb not found: ID does not exist" containerID="fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.525333 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb"} err="failed to get container status \"fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb\": rpc error: code = NotFound desc = could not find container \"fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb\": container with ID starting with fe6382e874ae423ece62076e21beaec96edb5aceaac82bb0b20a5086001bf7bb not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.525354 4735 scope.go:117] "RemoveContainer" 
containerID="1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.525664 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23\": container with ID starting with 1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23 not found: ID does not exist" containerID="1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.525688 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23"} err="failed to get container status \"1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23\": rpc error: code = NotFound desc = could not find container \"1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23\": container with ID starting with 1c9bd35e5f412af816b442911fcc6e1f1f4204ee6572d603e70f903e02089a23 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.525704 4735 scope.go:117] "RemoveContainer" containerID="ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.526264 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51\": container with ID starting with ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51 not found: ID does not exist" containerID="ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.526285 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51"} err="failed to get container status \"ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51\": rpc error: code = NotFound desc = could not find container \"ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51\": container with ID starting with ec6bc450f125079b1ff8f5bfed5fe00fc6648d9a5118b77e6469d6d59c6d3e51 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.526297 4735 scope.go:117] "RemoveContainer" containerID="05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.536563 4735 scope.go:117] "RemoveContainer" containerID="05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.536905 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53\": container with ID starting with 05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53 not found: ID does not exist" containerID="05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.536926 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53"} err="failed to get container status \"05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53\": rpc error: code = NotFound desc = could not find container \"05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53\": container with ID starting with 05e9cd1dd8f3dc066d621009e1ab66ac6677de399ec02b788ca6dab5801a0b53 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.536944 4735 scope.go:117] "RemoveContainer" 
containerID="dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.549832 4735 scope.go:117] "RemoveContainer" containerID="687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.565030 4735 scope.go:117] "RemoveContainer" containerID="5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.620499 4735 scope.go:117] "RemoveContainer" containerID="dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.620933 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5\": container with ID starting with dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5 not found: ID does not exist" containerID="dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.620960 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5"} err="failed to get container status \"dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5\": rpc error: code = NotFound desc = could not find container \"dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5\": container with ID starting with dc18c5466c9bc83daf559d76a7ddc45dba200130e64ccd5819e486d87122e4a5 not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.620980 4735 scope.go:117] "RemoveContainer" containerID="687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.621351 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce\": container with ID starting with 687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce not found: ID does not exist" containerID="687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.621374 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce"} err="failed to get container status \"687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce\": rpc error: code = NotFound desc = could not find container \"687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce\": container with ID starting with 687c4d48a8f6aed26dd4aede9f19686fa7aa43a4c0892f6d573cef2c00ac07ce not found: ID does not exist" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.621387 4735 scope.go:117] "RemoveContainer" containerID="5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78" Mar 17 01:15:46 crc kubenswrapper[4735]: E0317 01:15:46.621832 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78\": container with ID starting with 5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78 not found: ID does not exist" containerID="5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78" Mar 17 01:15:46 crc kubenswrapper[4735]: I0317 01:15:46.621870 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78"} err="failed to get container status \"5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78\": rpc error: code = NotFound desc = could not find container 
\"5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78\": container with ID starting with 5319dd7f17b8f1b45f5571b09eb2da2c887df61462afc55d9da749084f0f8b78 not found: ID does not exist" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.078320 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" path="/var/lib/kubelet/pods/1a4643c9-428f-46de-b795-44c73e85d7f9/volumes" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.079265 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" path="/var/lib/kubelet/pods/277daa3a-bd0c-46b3-915f-1050fbfa37ac/volumes" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.079946 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" path="/var/lib/kubelet/pods/2dcdba3a-a61a-4a1d-a3db-19dd90826b4c/volumes" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.080417 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" path="/var/lib/kubelet/pods/54e396e2-3911-4d16-9ff4-588b49a8a77c/volumes" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.081449 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" path="/var/lib/kubelet/pods/9f91af40-cf84-4b86-8aa5-fce087ec360d/volumes" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.370779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" event={"ID":"f56c71b2-cf3c-4fe1-8c13-fd905c5a623d","Type":"ContainerStarted","Data":"3397a2f73636cc9c522153db6d72f3505a79804ee067bb58cc45194a6e01bd11"} Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.371076 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:47 crc 
kubenswrapper[4735]: I0317 01:15:47.375061 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" Mar 17 01:15:47 crc kubenswrapper[4735]: I0317 01:15:47.388200 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2qkds" podStartSLOduration=2.38818685 podStartE2EDuration="2.38818685s" podCreationTimestamp="2026-03-17 01:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:15:47.384961877 +0000 UTC m=+373.017194905" watchObservedRunningTime="2026-03-17 01:15:47.38818685 +0000 UTC m=+373.020419828" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.149648 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561836-7zvsl"] Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150237 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerName="marketplace-operator" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150248 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerName="marketplace-operator" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150256 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150262 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150271 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 
01:16:00.150276 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150284 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150290 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150298 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150304 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150313 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150319 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150330 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150335 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150344 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 
01:16:00.150350 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150358 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150366 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150373 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150379 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="extract-utilities" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150386 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150391 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="extract-content" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150399 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150404 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: E0317 01:16:00.150412 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 
01:16:00.150417 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150494 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f91af40-cf84-4b86-8aa5-fce087ec360d" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150501 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dcdba3a-a61a-4a1d-a3db-19dd90826b4c" containerName="marketplace-operator" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150507 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4643c9-428f-46de-b795-44c73e85d7f9" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150517 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="277daa3a-bd0c-46b3-915f-1050fbfa37ac" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150525 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e396e2-3911-4d16-9ff4-588b49a8a77c" containerName="registry-server" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.150846 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.154452 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.154715 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.156643 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.223078 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-7zvsl"] Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.341760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgght\" (UniqueName: \"kubernetes.io/projected/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48-kube-api-access-fgght\") pod \"auto-csr-approver-29561836-7zvsl\" (UID: \"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48\") " pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.443615 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgght\" (UniqueName: \"kubernetes.io/projected/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48-kube-api-access-fgght\") pod \"auto-csr-approver-29561836-7zvsl\" (UID: \"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48\") " pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.476489 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgght\" (UniqueName: \"kubernetes.io/projected/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48-kube-api-access-fgght\") pod \"auto-csr-approver-29561836-7zvsl\" (UID: \"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48\") " 
pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:00 crc kubenswrapper[4735]: I0317 01:16:00.769350 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:01 crc kubenswrapper[4735]: I0317 01:16:01.226276 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-7zvsl"] Mar 17 01:16:01 crc kubenswrapper[4735]: I0317 01:16:01.464245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" event={"ID":"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48","Type":"ContainerStarted","Data":"287fcf9f32466a097a41d416aab4760b4063ed3d25d66fc0f7dab2673f5e4c55"} Mar 17 01:16:03 crc kubenswrapper[4735]: I0317 01:16:03.483082 4735 generic.go:334] "Generic (PLEG): container finished" podID="cca4f1b3-ff9b-4e10-b32e-844d67c4ae48" containerID="52e2d20b79c2019eec8fd538dfe4bdfeda4fc7eb6df5b8df4c53d7fe63de8b2a" exitCode=0 Mar 17 01:16:03 crc kubenswrapper[4735]: I0317 01:16:03.483190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" event={"ID":"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48","Type":"ContainerDied","Data":"52e2d20b79c2019eec8fd538dfe4bdfeda4fc7eb6df5b8df4c53d7fe63de8b2a"} Mar 17 01:16:04 crc kubenswrapper[4735]: I0317 01:16:04.835214 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:04 crc kubenswrapper[4735]: I0317 01:16:04.901231 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgght\" (UniqueName: \"kubernetes.io/projected/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48-kube-api-access-fgght\") pod \"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48\" (UID: \"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48\") " Mar 17 01:16:04 crc kubenswrapper[4735]: I0317 01:16:04.908849 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48-kube-api-access-fgght" (OuterVolumeSpecName: "kube-api-access-fgght") pod "cca4f1b3-ff9b-4e10-b32e-844d67c4ae48" (UID: "cca4f1b3-ff9b-4e10-b32e-844d67c4ae48"). InnerVolumeSpecName "kube-api-access-fgght". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:16:05 crc kubenswrapper[4735]: I0317 01:16:05.002482 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgght\" (UniqueName: \"kubernetes.io/projected/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48-kube-api-access-fgght\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:05 crc kubenswrapper[4735]: I0317 01:16:05.499764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" event={"ID":"cca4f1b3-ff9b-4e10-b32e-844d67c4ae48","Type":"ContainerDied","Data":"287fcf9f32466a097a41d416aab4760b4063ed3d25d66fc0f7dab2673f5e4c55"} Mar 17 01:16:05 crc kubenswrapper[4735]: I0317 01:16:05.500184 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287fcf9f32466a097a41d416aab4760b4063ed3d25d66fc0f7dab2673f5e4c55" Mar 17 01:16:05 crc kubenswrapper[4735]: I0317 01:16:05.499839 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-7zvsl" Mar 17 01:16:12 crc kubenswrapper[4735]: I0317 01:16:12.606255 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:16:12 crc kubenswrapper[4735]: I0317 01:16:12.606882 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:16:42 crc kubenswrapper[4735]: I0317 01:16:42.606284 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:16:42 crc kubenswrapper[4735]: I0317 01:16:42.607003 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:16:45 crc kubenswrapper[4735]: I0317 01:16:45.983237 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nr9qz"] Mar 17 01:16:45 crc kubenswrapper[4735]: E0317 01:16:45.983487 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca4f1b3-ff9b-4e10-b32e-844d67c4ae48" containerName="oc" Mar 17 01:16:45 crc kubenswrapper[4735]: I0317 
01:16:45.983504 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca4f1b3-ff9b-4e10-b32e-844d67c4ae48" containerName="oc" Mar 17 01:16:45 crc kubenswrapper[4735]: I0317 01:16:45.983605 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca4f1b3-ff9b-4e10-b32e-844d67c4ae48" containerName="oc" Mar 17 01:16:45 crc kubenswrapper[4735]: I0317 01:16:45.984106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.000800 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nr9qz"] Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1798579-bd73-4016-8cb6-a4497b128fb6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111286 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1798579-bd73-4016-8cb6-a4497b128fb6-registry-certificates\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1798579-bd73-4016-8cb6-a4497b128fb6-trusted-ca\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111361 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1798579-bd73-4016-8cb6-a4497b128fb6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111381 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-registry-tls\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111399 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4djk\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-kube-api-access-g4djk\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.111450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-bound-sa-token\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.129596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.212692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1798579-bd73-4016-8cb6-a4497b128fb6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.213044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1798579-bd73-4016-8cb6-a4497b128fb6-registry-certificates\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.213071 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1798579-bd73-4016-8cb6-a4497b128fb6-trusted-ca\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.213104 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1798579-bd73-4016-8cb6-a4497b128fb6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.213132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-registry-tls\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.214009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4djk\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-kube-api-access-g4djk\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.214070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-bound-sa-token\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.213200 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1798579-bd73-4016-8cb6-a4497b128fb6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.214495 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1798579-bd73-4016-8cb6-a4497b128fb6-registry-certificates\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.214991 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1798579-bd73-4016-8cb6-a4497b128fb6-trusted-ca\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.218178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1798579-bd73-4016-8cb6-a4497b128fb6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.219203 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-registry-tls\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.233027 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-bound-sa-token\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: 
\"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.238286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4djk\" (UniqueName: \"kubernetes.io/projected/d1798579-bd73-4016-8cb6-a4497b128fb6-kube-api-access-g4djk\") pod \"image-registry-66df7c8f76-nr9qz\" (UID: \"d1798579-bd73-4016-8cb6-a4497b128fb6\") " pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.302366 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.539593 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nr9qz"] Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.804609 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" event={"ID":"d1798579-bd73-4016-8cb6-a4497b128fb6","Type":"ContainerStarted","Data":"cc263b4ff398a754e72cce3e670f6c34d1ffa2f6758fe851e6a263a68935dee0"} Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.804672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" event={"ID":"d1798579-bd73-4016-8cb6-a4497b128fb6","Type":"ContainerStarted","Data":"b2cfa82f632f585a145ad78087ca26fac3e2133e83bd7961d9792eb2a0a27bf6"} Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.804716 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:16:46 crc kubenswrapper[4735]: I0317 01:16:46.825360 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" podStartSLOduration=1.825336182 
podStartE2EDuration="1.825336182s" podCreationTimestamp="2026-03-17 01:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:16:46.821829462 +0000 UTC m=+432.454062470" watchObservedRunningTime="2026-03-17 01:16:46.825336182 +0000 UTC m=+432.457569210" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.295613 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m7k4g"] Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.298179 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.301941 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.325212 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7k4g"] Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.387742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwn8\" (UniqueName: \"kubernetes.io/projected/93613795-fcb8-40c7-a9d7-5001de165a69-kube-api-access-tzwn8\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.387835 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93613795-fcb8-40c7-a9d7-5001de165a69-catalog-content\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.387874 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93613795-fcb8-40c7-a9d7-5001de165a69-utilities\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.485228 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6nl7"] Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.486352 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.491094 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.491944 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwn8\" (UniqueName: \"kubernetes.io/projected/93613795-fcb8-40c7-a9d7-5001de165a69-kube-api-access-tzwn8\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.492028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93613795-fcb8-40c7-a9d7-5001de165a69-utilities\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.492051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93613795-fcb8-40c7-a9d7-5001de165a69-catalog-content\") pod \"redhat-marketplace-m7k4g\" (UID: 
\"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.492573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93613795-fcb8-40c7-a9d7-5001de165a69-catalog-content\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.492955 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93613795-fcb8-40c7-a9d7-5001de165a69-utilities\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.515578 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6nl7"] Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.523559 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwn8\" (UniqueName: \"kubernetes.io/projected/93613795-fcb8-40c7-a9d7-5001de165a69-kube-api-access-tzwn8\") pod \"redhat-marketplace-m7k4g\" (UID: \"93613795-fcb8-40c7-a9d7-5001de165a69\") " pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.593798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lngmj\" (UniqueName: \"kubernetes.io/projected/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-kube-api-access-lngmj\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.594333 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-utilities\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.594515 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-catalog-content\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.634084 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.696379 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-utilities\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.696762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-catalog-content\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.696822 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lngmj\" (UniqueName: \"kubernetes.io/projected/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-kube-api-access-lngmj\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " 
pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.697927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-utilities\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.698357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-catalog-content\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.726465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngmj\" (UniqueName: \"kubernetes.io/projected/8ce6a4e4-dae9-4138-871c-c9c3c05641e6-kube-api-access-lngmj\") pod \"redhat-operators-p6nl7\" (UID: \"8ce6a4e4-dae9-4138-871c-c9c3c05641e6\") " pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.809749 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:16:58 crc kubenswrapper[4735]: I0317 01:16:58.999809 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6nl7"] Mar 17 01:16:59 crc kubenswrapper[4735]: W0317 01:16:59.006363 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce6a4e4_dae9_4138_871c_c9c3c05641e6.slice/crio-167d425a7aebbdb066dd2afa6f792036c848b3950b31c9e2bc2d98c12774f3cf WatchSource:0}: Error finding container 167d425a7aebbdb066dd2afa6f792036c848b3950b31c9e2bc2d98c12774f3cf: Status 404 returned error can't find the container with id 167d425a7aebbdb066dd2afa6f792036c848b3950b31c9e2bc2d98c12774f3cf Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.046820 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7k4g"] Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.884956 4735 generic.go:334] "Generic (PLEG): container finished" podID="93613795-fcb8-40c7-a9d7-5001de165a69" containerID="b3cac34d19c0d6872eed373b27618dd926b4fafd8ef6fc341dcdee18900fcf33" exitCode=0 Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.885013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7k4g" event={"ID":"93613795-fcb8-40c7-a9d7-5001de165a69","Type":"ContainerDied","Data":"b3cac34d19c0d6872eed373b27618dd926b4fafd8ef6fc341dcdee18900fcf33"} Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.885086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7k4g" event={"ID":"93613795-fcb8-40c7-a9d7-5001de165a69","Type":"ContainerStarted","Data":"4d00bd6527b5fdb37bf1ef01f4aca7e490be146319ca7707c56a566c9c0ee6bb"} Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.887765 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="8ce6a4e4-dae9-4138-871c-c9c3c05641e6" containerID="f574ee2fdaedbff0f2bf7c68aec693bc1c247d9ce8e80104fbd2bc5dd7e2339b" exitCode=0 Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.887821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nl7" event={"ID":"8ce6a4e4-dae9-4138-871c-c9c3c05641e6","Type":"ContainerDied","Data":"f574ee2fdaedbff0f2bf7c68aec693bc1c247d9ce8e80104fbd2bc5dd7e2339b"} Mar 17 01:16:59 crc kubenswrapper[4735]: I0317 01:16:59.887897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nl7" event={"ID":"8ce6a4e4-dae9-4138-871c-c9c3c05641e6","Type":"ContainerStarted","Data":"167d425a7aebbdb066dd2afa6f792036c848b3950b31c9e2bc2d98c12774f3cf"} Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.094166 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n6ssc"] Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.096321 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.097701 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6ssc"] Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.098997 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.220741 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-utilities\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.220805 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlshr\" (UniqueName: \"kubernetes.io/projected/8c41bd0b-1749-486f-a7c6-c86362e3c03c-kube-api-access-vlshr\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.220926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-catalog-content\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.322171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-utilities\") pod \"community-operators-n6ssc\" (UID: 
\"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.322230 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlshr\" (UniqueName: \"kubernetes.io/projected/8c41bd0b-1749-486f-a7c6-c86362e3c03c-kube-api-access-vlshr\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.322287 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-catalog-content\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.323251 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-catalog-content\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.323686 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-utilities\") pod \"community-operators-n6ssc\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.348071 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlshr\" (UniqueName: \"kubernetes.io/projected/8c41bd0b-1749-486f-a7c6-c86362e3c03c-kube-api-access-vlshr\") pod \"community-operators-n6ssc\" (UID: 
\"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.424957 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.654071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6ssc"] Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.896680 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nl7" event={"ID":"8ce6a4e4-dae9-4138-871c-c9c3c05641e6","Type":"ContainerStarted","Data":"09eed5513fb5d5ad7a552e93d405790c7e439f78ed7c7d2d3f710d05eb0298bc"} Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.899086 4735 generic.go:334] "Generic (PLEG): container finished" podID="93613795-fcb8-40c7-a9d7-5001de165a69" containerID="3c77da61fc8ba76b7be3101193260af365f0b0f7d0fc3db6dcb291c2813744a0" exitCode=0 Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.899187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7k4g" event={"ID":"93613795-fcb8-40c7-a9d7-5001de165a69","Type":"ContainerDied","Data":"3c77da61fc8ba76b7be3101193260af365f0b0f7d0fc3db6dcb291c2813744a0"} Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.901116 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerID="544ca71f24a1553dace12d8780827c9b29d4ae04c20385ded1a1128723898622" exitCode=0 Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 01:17:00.901145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6ssc" event={"ID":"8c41bd0b-1749-486f-a7c6-c86362e3c03c","Type":"ContainerDied","Data":"544ca71f24a1553dace12d8780827c9b29d4ae04c20385ded1a1128723898622"} Mar 17 01:17:00 crc kubenswrapper[4735]: I0317 
01:17:00.901167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6ssc" event={"ID":"8c41bd0b-1749-486f-a7c6-c86362e3c03c","Type":"ContainerStarted","Data":"c9b373c6ad1ca5ad288d4222bf12a32155b7a8f581bd8a419712c013d165dbac"} Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.492675 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9sppg"] Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.494096 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.495952 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.515670 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9sppg"] Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.638040 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8z8\" (UniqueName: \"kubernetes.io/projected/d5d9a075-a0a3-4aae-bd44-459f3df46522-kube-api-access-sh8z8\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.638135 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-catalog-content\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.638261 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-utilities\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.739518 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8z8\" (UniqueName: \"kubernetes.io/projected/d5d9a075-a0a3-4aae-bd44-459f3df46522-kube-api-access-sh8z8\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.739560 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-catalog-content\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.739625 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-utilities\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.739988 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-utilities\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.740088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-catalog-content\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.767250 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8z8\" (UniqueName: \"kubernetes.io/projected/d5d9a075-a0a3-4aae-bd44-459f3df46522-kube-api-access-sh8z8\") pod \"certified-operators-9sppg\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.825330 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.911064 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerID="075e8800cbf5e47ac99627981e2d61f9cdbde20ed9985443a1be16919c8cdc8e" exitCode=0 Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.911171 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6ssc" event={"ID":"8c41bd0b-1749-486f-a7c6-c86362e3c03c","Type":"ContainerDied","Data":"075e8800cbf5e47ac99627981e2d61f9cdbde20ed9985443a1be16919c8cdc8e"} Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.930954 4735 generic.go:334] "Generic (PLEG): container finished" podID="8ce6a4e4-dae9-4138-871c-c9c3c05641e6" containerID="09eed5513fb5d5ad7a552e93d405790c7e439f78ed7c7d2d3f710d05eb0298bc" exitCode=0 Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.931031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nl7" event={"ID":"8ce6a4e4-dae9-4138-871c-c9c3c05641e6","Type":"ContainerDied","Data":"09eed5513fb5d5ad7a552e93d405790c7e439f78ed7c7d2d3f710d05eb0298bc"} Mar 17 01:17:01 crc 
kubenswrapper[4735]: I0317 01:17:01.935714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7k4g" event={"ID":"93613795-fcb8-40c7-a9d7-5001de165a69","Type":"ContainerStarted","Data":"19329f318289cecacb7affebac137a711ca48cacde15c43fd134d11bdea560c3"} Mar 17 01:17:01 crc kubenswrapper[4735]: I0317 01:17:01.958986 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m7k4g" podStartSLOduration=2.463589129 podStartE2EDuration="3.95896972s" podCreationTimestamp="2026-03-17 01:16:58 +0000 UTC" firstStartedPulling="2026-03-17 01:16:59.889537284 +0000 UTC m=+445.521770302" lastFinishedPulling="2026-03-17 01:17:01.384917875 +0000 UTC m=+447.017150893" observedRunningTime="2026-03-17 01:17:01.958390316 +0000 UTC m=+447.590623294" watchObservedRunningTime="2026-03-17 01:17:01.95896972 +0000 UTC m=+447.591202698" Mar 17 01:17:02 crc kubenswrapper[4735]: I0317 01:17:02.058221 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9sppg"] Mar 17 01:17:02 crc kubenswrapper[4735]: I0317 01:17:02.947772 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6ssc" event={"ID":"8c41bd0b-1749-486f-a7c6-c86362e3c03c","Type":"ContainerStarted","Data":"6b34533a5ae373b7adc1035147c72c494f45e0974d3cdd0ea4f48b57797ef6f3"} Mar 17 01:17:02 crc kubenswrapper[4735]: I0317 01:17:02.949918 4735 generic.go:334] "Generic (PLEG): container finished" podID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerID="d3476732ac9c7a11b80e6341abb3f3f81f7fa92f2777a9e014f9cd164ca59a61" exitCode=0 Mar 17 01:17:02 crc kubenswrapper[4735]: I0317 01:17:02.950065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sppg" 
event={"ID":"d5d9a075-a0a3-4aae-bd44-459f3df46522","Type":"ContainerDied","Data":"d3476732ac9c7a11b80e6341abb3f3f81f7fa92f2777a9e014f9cd164ca59a61"} Mar 17 01:17:02 crc kubenswrapper[4735]: I0317 01:17:02.950147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sppg" event={"ID":"d5d9a075-a0a3-4aae-bd44-459f3df46522","Type":"ContainerStarted","Data":"e5a8a1536d93b308b8568799944693dc5fd93abf1165c6a59ea232a0a5f73857"} Mar 17 01:17:02 crc kubenswrapper[4735]: I0317 01:17:02.980311 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n6ssc" podStartSLOduration=1.406915105 podStartE2EDuration="2.980293648s" podCreationTimestamp="2026-03-17 01:17:00 +0000 UTC" firstStartedPulling="2026-03-17 01:17:00.902152091 +0000 UTC m=+446.534385069" lastFinishedPulling="2026-03-17 01:17:02.475530604 +0000 UTC m=+448.107763612" observedRunningTime="2026-03-17 01:17:02.978788 +0000 UTC m=+448.611021008" watchObservedRunningTime="2026-03-17 01:17:02.980293648 +0000 UTC m=+448.612526626" Mar 17 01:17:03 crc kubenswrapper[4735]: I0317 01:17:03.959127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nl7" event={"ID":"8ce6a4e4-dae9-4138-871c-c9c3c05641e6","Type":"ContainerStarted","Data":"b8a48794096f5fbafc8abd333877bc8fb4c4a1fa8846b10db78b0d2ad42b82bd"} Mar 17 01:17:03 crc kubenswrapper[4735]: I0317 01:17:03.963144 4735 generic.go:334] "Generic (PLEG): container finished" podID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerID="87b5b9f1a2d765d08cb6dd2182b286fb6974d93ffc1ea6ba6419fd82a78fef95" exitCode=0 Mar 17 01:17:03 crc kubenswrapper[4735]: I0317 01:17:03.965468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sppg" event={"ID":"d5d9a075-a0a3-4aae-bd44-459f3df46522","Type":"ContainerDied","Data":"87b5b9f1a2d765d08cb6dd2182b286fb6974d93ffc1ea6ba6419fd82a78fef95"} 
Mar 17 01:17:04 crc kubenswrapper[4735]: I0317 01:17:04.010032 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6nl7" podStartSLOduration=2.5805133209999997 podStartE2EDuration="6.010015238s" podCreationTimestamp="2026-03-17 01:16:58 +0000 UTC" firstStartedPulling="2026-03-17 01:16:59.889416651 +0000 UTC m=+445.521649639" lastFinishedPulling="2026-03-17 01:17:03.318918538 +0000 UTC m=+448.951151556" observedRunningTime="2026-03-17 01:17:03.991651262 +0000 UTC m=+449.623884240" watchObservedRunningTime="2026-03-17 01:17:04.010015238 +0000 UTC m=+449.642248216" Mar 17 01:17:04 crc kubenswrapper[4735]: I0317 01:17:04.971106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sppg" event={"ID":"d5d9a075-a0a3-4aae-bd44-459f3df46522","Type":"ContainerStarted","Data":"0092f459ef06a1b51c84737b1c3cf7425ae7bc5a4a5284c2ff56ae125fb55a1e"} Mar 17 01:17:04 crc kubenswrapper[4735]: I0317 01:17:04.999057 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9sppg" podStartSLOduration=2.615383765 podStartE2EDuration="3.999037115s" podCreationTimestamp="2026-03-17 01:17:01 +0000 UTC" firstStartedPulling="2026-03-17 01:17:02.951332121 +0000 UTC m=+448.583565129" lastFinishedPulling="2026-03-17 01:17:04.334985491 +0000 UTC m=+449.967218479" observedRunningTime="2026-03-17 01:17:04.9953352 +0000 UTC m=+450.627568178" watchObservedRunningTime="2026-03-17 01:17:04.999037115 +0000 UTC m=+450.631270093" Mar 17 01:17:06 crc kubenswrapper[4735]: I0317 01:17:06.306900 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-nr9qz" Mar 17 01:17:06 crc kubenswrapper[4735]: I0317 01:17:06.367166 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d9dqr"] Mar 17 01:17:08 crc 
kubenswrapper[4735]: I0317 01:17:08.635382 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:17:08 crc kubenswrapper[4735]: I0317 01:17:08.635455 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:17:08 crc kubenswrapper[4735]: I0317 01:17:08.719960 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:17:08 crc kubenswrapper[4735]: I0317 01:17:08.810503 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:17:08 crc kubenswrapper[4735]: I0317 01:17:08.811297 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:17:09 crc kubenswrapper[4735]: I0317 01:17:09.034909 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m7k4g" Mar 17 01:17:09 crc kubenswrapper[4735]: I0317 01:17:09.871793 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p6nl7" podUID="8ce6a4e4-dae9-4138-871c-c9c3c05641e6" containerName="registry-server" probeResult="failure" output=< Mar 17 01:17:09 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:17:09 crc kubenswrapper[4735]: > Mar 17 01:17:10 crc kubenswrapper[4735]: I0317 01:17:10.425639 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:10 crc kubenswrapper[4735]: I0317 01:17:10.425742 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:10 crc kubenswrapper[4735]: I0317 01:17:10.499092 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:11 crc kubenswrapper[4735]: I0317 01:17:11.065809 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 01:17:11 crc kubenswrapper[4735]: I0317 01:17:11.825829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:11 crc kubenswrapper[4735]: I0317 01:17:11.827563 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:11 crc kubenswrapper[4735]: I0317 01:17:11.887359 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:12 crc kubenswrapper[4735]: I0317 01:17:12.085830 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:17:12 crc kubenswrapper[4735]: I0317 01:17:12.606828 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:17:12 crc kubenswrapper[4735]: I0317 01:17:12.606929 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:17:12 crc kubenswrapper[4735]: I0317 01:17:12.606982 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:17:12 crc kubenswrapper[4735]: I0317 01:17:12.607711 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5f22d21e238a9e73749f0d0aab48f2d631b711d6cf4f090c4834f75705121ca"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:17:12 crc kubenswrapper[4735]: I0317 01:17:12.608720 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://a5f22d21e238a9e73749f0d0aab48f2d631b711d6cf4f090c4834f75705121ca" gracePeriod=600 Mar 17 01:17:13 crc kubenswrapper[4735]: I0317 01:17:13.024993 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="a5f22d21e238a9e73749f0d0aab48f2d631b711d6cf4f090c4834f75705121ca" exitCode=0 Mar 17 01:17:13 crc kubenswrapper[4735]: I0317 01:17:13.025121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"a5f22d21e238a9e73749f0d0aab48f2d631b711d6cf4f090c4834f75705121ca"} Mar 17 01:17:13 crc kubenswrapper[4735]: I0317 01:17:13.025752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"991b107d6d47eeaa11d2f6b7ad11162b455fb9438638f292ae0c050ecb047ad9"} Mar 17 01:17:13 crc kubenswrapper[4735]: I0317 01:17:13.025794 4735 scope.go:117] "RemoveContainer" 
containerID="b791a849db8a9648cb1f45135930a9b8d420a4af065f024316f762e7a4828e2b" Mar 17 01:17:18 crc kubenswrapper[4735]: I0317 01:17:18.878441 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:17:18 crc kubenswrapper[4735]: I0317 01:17:18.947295 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6nl7" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.413715 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" podUID="85e67779-627e-4c7b-8105-8bb93f10ec15" containerName="registry" containerID="cri-o://ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00" gracePeriod=30 Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.804635 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900707 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85e67779-627e-4c7b-8105-8bb93f10ec15-installation-pull-secrets\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900789 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-tls\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900818 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/85e67779-627e-4c7b-8105-8bb93f10ec15-ca-trust-extracted\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs75q\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-kube-api-access-xs75q\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900883 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-bound-sa-token\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900909 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-trusted-ca\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.900963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-certificates\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.901187 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"85e67779-627e-4c7b-8105-8bb93f10ec15\" (UID: \"85e67779-627e-4c7b-8105-8bb93f10ec15\") " Mar 17 
01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.903362 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.904291 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.910473 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e67779-627e-4c7b-8105-8bb93f10ec15-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.910820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-kube-api-access-xs75q" (OuterVolumeSpecName: "kube-api-access-xs75q") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "kube-api-access-xs75q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.911046 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.913143 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.914486 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4735]: I0317 01:17:31.925597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e67779-627e-4c7b-8105-8bb93f10ec15-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "85e67779-627e-4c7b-8105-8bb93f10ec15" (UID: "85e67779-627e-4c7b-8105-8bb93f10ec15"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.002839 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.003283 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.003304 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.003325 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85e67779-627e-4c7b-8105-8bb93f10ec15-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.003342 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.003359 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85e67779-627e-4c7b-8105-8bb93f10ec15-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.003456 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs75q\" (UniqueName: \"kubernetes.io/projected/85e67779-627e-4c7b-8105-8bb93f10ec15-kube-api-access-xs75q\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc 
kubenswrapper[4735]: I0317 01:17:32.186441 4735 generic.go:334] "Generic (PLEG): container finished" podID="85e67779-627e-4c7b-8105-8bb93f10ec15" containerID="ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00" exitCode=0 Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.186563 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.186803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" event={"ID":"85e67779-627e-4c7b-8105-8bb93f10ec15","Type":"ContainerDied","Data":"ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00"} Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.187114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d9dqr" event={"ID":"85e67779-627e-4c7b-8105-8bb93f10ec15","Type":"ContainerDied","Data":"32c1b4483dfcc924cde4b3e3ca2301718aec8ec553f7170f3f0c4464272d9368"} Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.187193 4735 scope.go:117] "RemoveContainer" containerID="ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.221078 4735 scope.go:117] "RemoveContainer" containerID="ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00" Mar 17 01:17:32 crc kubenswrapper[4735]: E0317 01:17:32.221651 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00\": container with ID starting with ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00 not found: ID does not exist" containerID="ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.221714 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00"} err="failed to get container status \"ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00\": rpc error: code = NotFound desc = could not find container \"ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00\": container with ID starting with ea8dd25c43410c2eb0f8d19621bc4be7b93675b820adc4c5bfe26382d49e8a00 not found: ID does not exist" Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.239103 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d9dqr"] Mar 17 01:17:32 crc kubenswrapper[4735]: I0317 01:17:32.245123 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d9dqr"] Mar 17 01:17:33 crc kubenswrapper[4735]: I0317 01:17:33.081904 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e67779-627e-4c7b-8105-8bb93f10ec15" path="/var/lib/kubelet/pods/85e67779-627e-4c7b-8105-8bb93f10ec15/volumes" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.142146 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561838-298bz"] Mar 17 01:18:00 crc kubenswrapper[4735]: E0317 01:18:00.143047 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e67779-627e-4c7b-8105-8bb93f10ec15" containerName="registry" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.143068 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e67779-627e-4c7b-8105-8bb93f10ec15" containerName="registry" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.143286 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e67779-627e-4c7b-8105-8bb93f10ec15" containerName="registry" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.143845 4735 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.146497 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.152395 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.152572 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.155598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-298bz"] Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.271838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8vtp\" (UniqueName: \"kubernetes.io/projected/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176-kube-api-access-z8vtp\") pod \"auto-csr-approver-29561838-298bz\" (UID: \"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176\") " pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.374084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8vtp\" (UniqueName: \"kubernetes.io/projected/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176-kube-api-access-z8vtp\") pod \"auto-csr-approver-29561838-298bz\" (UID: \"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176\") " pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.401296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8vtp\" (UniqueName: \"kubernetes.io/projected/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176-kube-api-access-z8vtp\") pod \"auto-csr-approver-29561838-298bz\" (UID: \"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176\") " 
pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.471530 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:00 crc kubenswrapper[4735]: I0317 01:18:00.756376 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-298bz"] Mar 17 01:18:01 crc kubenswrapper[4735]: I0317 01:18:01.360845 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-298bz" event={"ID":"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176","Type":"ContainerStarted","Data":"073d5b8663157d01a2f192b8471de557666ce616fe3e24a57cd81f29f0197f62"} Mar 17 01:18:02 crc kubenswrapper[4735]: I0317 01:18:02.369434 4735 generic.go:334] "Generic (PLEG): container finished" podID="4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176" containerID="1a612cfa9aafce66665a794064c741dec915e254c572dab5974f01aabe68560e" exitCode=0 Mar 17 01:18:02 crc kubenswrapper[4735]: I0317 01:18:02.369552 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-298bz" event={"ID":"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176","Type":"ContainerDied","Data":"1a612cfa9aafce66665a794064c741dec915e254c572dab5974f01aabe68560e"} Mar 17 01:18:03 crc kubenswrapper[4735]: I0317 01:18:03.678165 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:03 crc kubenswrapper[4735]: I0317 01:18:03.828369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8vtp\" (UniqueName: \"kubernetes.io/projected/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176-kube-api-access-z8vtp\") pod \"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176\" (UID: \"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176\") " Mar 17 01:18:03 crc kubenswrapper[4735]: I0317 01:18:03.837215 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176-kube-api-access-z8vtp" (OuterVolumeSpecName: "kube-api-access-z8vtp") pod "4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176" (UID: "4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176"). InnerVolumeSpecName "kube-api-access-z8vtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:18:03 crc kubenswrapper[4735]: I0317 01:18:03.932761 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8vtp\" (UniqueName: \"kubernetes.io/projected/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176-kube-api-access-z8vtp\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:04 crc kubenswrapper[4735]: I0317 01:18:04.385263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-298bz" event={"ID":"4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176","Type":"ContainerDied","Data":"073d5b8663157d01a2f192b8471de557666ce616fe3e24a57cd81f29f0197f62"} Mar 17 01:18:04 crc kubenswrapper[4735]: I0317 01:18:04.385329 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="073d5b8663157d01a2f192b8471de557666ce616fe3e24a57cd81f29f0197f62" Mar 17 01:18:04 crc kubenswrapper[4735]: I0317 01:18:04.385379 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-298bz" Mar 17 01:18:04 crc kubenswrapper[4735]: I0317 01:18:04.764459 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-p6vx6"] Mar 17 01:18:04 crc kubenswrapper[4735]: I0317 01:18:04.770450 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-p6vx6"] Mar 17 01:18:05 crc kubenswrapper[4735]: I0317 01:18:05.085021 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4bd5744-869c-4763-af43-3ffcce4d549f" path="/var/lib/kubelet/pods/c4bd5744-869c-4763-af43-3ffcce4d549f/volumes" Mar 17 01:19:12 crc kubenswrapper[4735]: I0317 01:19:12.606464 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:19:12 crc kubenswrapper[4735]: I0317 01:19:12.607315 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:19:42 crc kubenswrapper[4735]: I0317 01:19:42.606372 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:19:42 crc kubenswrapper[4735]: I0317 01:19:42.607140 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.137780 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561840-gsft7"] Mar 17 01:20:00 crc kubenswrapper[4735]: E0317 01:20:00.138685 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176" containerName="oc" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.138709 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176" containerName="oc" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.138973 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176" containerName="oc" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.139553 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.148803 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.149307 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.149739 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.152130 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-gsft7"] Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.177915 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4pgq\" (UniqueName: \"kubernetes.io/projected/7b61c1fb-1ab0-4aac-9086-c665a8521904-kube-api-access-r4pgq\") pod \"auto-csr-approver-29561840-gsft7\" (UID: \"7b61c1fb-1ab0-4aac-9086-c665a8521904\") " pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.278380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4pgq\" (UniqueName: \"kubernetes.io/projected/7b61c1fb-1ab0-4aac-9086-c665a8521904-kube-api-access-r4pgq\") pod \"auto-csr-approver-29561840-gsft7\" (UID: \"7b61c1fb-1ab0-4aac-9086-c665a8521904\") " pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.300835 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4pgq\" (UniqueName: \"kubernetes.io/projected/7b61c1fb-1ab0-4aac-9086-c665a8521904-kube-api-access-r4pgq\") pod \"auto-csr-approver-29561840-gsft7\" (UID: \"7b61c1fb-1ab0-4aac-9086-c665a8521904\") " 
pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.466417 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.720339 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-gsft7"] Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.731400 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:20:00 crc kubenswrapper[4735]: I0317 01:20:00.786515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-gsft7" event={"ID":"7b61c1fb-1ab0-4aac-9086-c665a8521904","Type":"ContainerStarted","Data":"f5f48d6ddb4e5238e1d25ce17114068b2be1920e4453d58c8cf9c0df061f6898"} Mar 17 01:20:02 crc kubenswrapper[4735]: I0317 01:20:02.811997 4735 generic.go:334] "Generic (PLEG): container finished" podID="7b61c1fb-1ab0-4aac-9086-c665a8521904" containerID="9016f45133340a0c4eea3a5c2ef494fa622ee1ba2e7d882d048ebcd6930a55b8" exitCode=0 Mar 17 01:20:02 crc kubenswrapper[4735]: I0317 01:20:02.812013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-gsft7" event={"ID":"7b61c1fb-1ab0-4aac-9086-c665a8521904","Type":"ContainerDied","Data":"9016f45133340a0c4eea3a5c2ef494fa622ee1ba2e7d882d048ebcd6930a55b8"} Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.154475 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.334920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4pgq\" (UniqueName: \"kubernetes.io/projected/7b61c1fb-1ab0-4aac-9086-c665a8521904-kube-api-access-r4pgq\") pod \"7b61c1fb-1ab0-4aac-9086-c665a8521904\" (UID: \"7b61c1fb-1ab0-4aac-9086-c665a8521904\") " Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.349176 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b61c1fb-1ab0-4aac-9086-c665a8521904-kube-api-access-r4pgq" (OuterVolumeSpecName: "kube-api-access-r4pgq") pod "7b61c1fb-1ab0-4aac-9086-c665a8521904" (UID: "7b61c1fb-1ab0-4aac-9086-c665a8521904"). InnerVolumeSpecName "kube-api-access-r4pgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.438397 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4pgq\" (UniqueName: \"kubernetes.io/projected/7b61c1fb-1ab0-4aac-9086-c665a8521904-kube-api-access-r4pgq\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.831099 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-gsft7" event={"ID":"7b61c1fb-1ab0-4aac-9086-c665a8521904","Type":"ContainerDied","Data":"f5f48d6ddb4e5238e1d25ce17114068b2be1920e4453d58c8cf9c0df061f6898"} Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.831157 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f48d6ddb4e5238e1d25ce17114068b2be1920e4453d58c8cf9c0df061f6898" Mar 17 01:20:04 crc kubenswrapper[4735]: I0317 01:20:04.831189 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-gsft7" Mar 17 01:20:05 crc kubenswrapper[4735]: I0317 01:20:05.246129 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-t7858"] Mar 17 01:20:05 crc kubenswrapper[4735]: I0317 01:20:05.256188 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-t7858"] Mar 17 01:20:07 crc kubenswrapper[4735]: I0317 01:20:07.088168 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d726aceb-cffe-4667-9136-795a1442e125" path="/var/lib/kubelet/pods/d726aceb-cffe-4667-9136-795a1442e125/volumes" Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.606652 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.607313 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.607366 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.608130 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"991b107d6d47eeaa11d2f6b7ad11162b455fb9438638f292ae0c050ecb047ad9"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.608214 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://991b107d6d47eeaa11d2f6b7ad11162b455fb9438638f292ae0c050ecb047ad9" gracePeriod=600 Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.909365 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="991b107d6d47eeaa11d2f6b7ad11162b455fb9438638f292ae0c050ecb047ad9" exitCode=0 Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.909418 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"991b107d6d47eeaa11d2f6b7ad11162b455fb9438638f292ae0c050ecb047ad9"} Mar 17 01:20:12 crc kubenswrapper[4735]: I0317 01:20:12.909472 4735 scope.go:117] "RemoveContainer" containerID="a5f22d21e238a9e73749f0d0aab48f2d631b711d6cf4f090c4834f75705121ca" Mar 17 01:20:13 crc kubenswrapper[4735]: I0317 01:20:13.920958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"89c7cb5343209fd70c941b27bbdf54f6b99a837d5fb5aeda971561981f6bf62a"} Mar 17 01:20:35 crc kubenswrapper[4735]: I0317 01:20:35.507462 4735 scope.go:117] "RemoveContainer" containerID="4c2ae0a64a50cfc874ce1cb1c4c7af1fc8c6f946744975423d57f0681d565aeb" Mar 17 01:20:35 crc kubenswrapper[4735]: I0317 01:20:35.560718 4735 scope.go:117] "RemoveContainer" containerID="cee9fac3b6f9b8dde02e0a30a2a9c05b4f0af2523f31e5881aca57a1c632018d" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 
01:21:08.111829 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl"] Mar 17 01:21:08 crc kubenswrapper[4735]: E0317 01:21:08.112441 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b61c1fb-1ab0-4aac-9086-c665a8521904" containerName="oc" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.112452 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b61c1fb-1ab0-4aac-9086-c665a8521904" containerName="oc" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.112543 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b61c1fb-1ab0-4aac-9086-c665a8521904" containerName="oc" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.112883 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.115918 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-cwxd7" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.116055 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.116153 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.123362 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl"] Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.126755 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-4t98c"] Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.127475 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4t98c" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.142006 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-l9lnz" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.148696 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-g9frk"] Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.149630 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.153959 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4t98c"] Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.169999 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lcqm6" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.177573 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-g9frk"] Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.288577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnqvl\" (UniqueName: \"kubernetes.io/projected/f026bd7d-4093-433f-b42a-e2f88dbd2c7f-kube-api-access-hnqvl\") pod \"cert-manager-cainjector-cf98fcc89-5g5gl\" (UID: \"f026bd7d-4093-433f-b42a-e2f88dbd2c7f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.288624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgpcv\" (UniqueName: \"kubernetes.io/projected/2c5a87ba-17e0-4807-980f-f42af0ffb51a-kube-api-access-sgpcv\") pod \"cert-manager-webhook-687f57d79b-g9frk\" (UID: \"2c5a87ba-17e0-4807-980f-f42af0ffb51a\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.288679 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqzn7\" (UniqueName: \"kubernetes.io/projected/a4d0b9dc-dc40-431d-9fc3-8a45378faaf9-kube-api-access-rqzn7\") pod \"cert-manager-858654f9db-4t98c\" (UID: \"a4d0b9dc-dc40-431d-9fc3-8a45378faaf9\") " pod="cert-manager/cert-manager-858654f9db-4t98c" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.390085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqzn7\" (UniqueName: \"kubernetes.io/projected/a4d0b9dc-dc40-431d-9fc3-8a45378faaf9-kube-api-access-rqzn7\") pod \"cert-manager-858654f9db-4t98c\" (UID: \"a4d0b9dc-dc40-431d-9fc3-8a45378faaf9\") " pod="cert-manager/cert-manager-858654f9db-4t98c" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.390334 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnqvl\" (UniqueName: \"kubernetes.io/projected/f026bd7d-4093-433f-b42a-e2f88dbd2c7f-kube-api-access-hnqvl\") pod \"cert-manager-cainjector-cf98fcc89-5g5gl\" (UID: \"f026bd7d-4093-433f-b42a-e2f88dbd2c7f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.390412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgpcv\" (UniqueName: \"kubernetes.io/projected/2c5a87ba-17e0-4807-980f-f42af0ffb51a-kube-api-access-sgpcv\") pod \"cert-manager-webhook-687f57d79b-g9frk\" (UID: \"2c5a87ba-17e0-4807-980f-f42af0ffb51a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.412703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqzn7\" (UniqueName: 
\"kubernetes.io/projected/a4d0b9dc-dc40-431d-9fc3-8a45378faaf9-kube-api-access-rqzn7\") pod \"cert-manager-858654f9db-4t98c\" (UID: \"a4d0b9dc-dc40-431d-9fc3-8a45378faaf9\") " pod="cert-manager/cert-manager-858654f9db-4t98c" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.413905 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnqvl\" (UniqueName: \"kubernetes.io/projected/f026bd7d-4093-433f-b42a-e2f88dbd2c7f-kube-api-access-hnqvl\") pod \"cert-manager-cainjector-cf98fcc89-5g5gl\" (UID: \"f026bd7d-4093-433f-b42a-e2f88dbd2c7f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.415342 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgpcv\" (UniqueName: \"kubernetes.io/projected/2c5a87ba-17e0-4807-980f-f42af0ffb51a-kube-api-access-sgpcv\") pod \"cert-manager-webhook-687f57d79b-g9frk\" (UID: \"2c5a87ba-17e0-4807-980f-f42af0ffb51a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.433090 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.459983 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4t98c" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.460767 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.710048 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl"] Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.754084 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-g9frk"] Mar 17 01:21:08 crc kubenswrapper[4735]: W0317 01:21:08.757979 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c5a87ba_17e0_4807_980f_f42af0ffb51a.slice/crio-7db567777b792a579e9315bdb8efeedf99ebc6ade1918cdc7fbf7edb78a72eb8 WatchSource:0}: Error finding container 7db567777b792a579e9315bdb8efeedf99ebc6ade1918cdc7fbf7edb78a72eb8: Status 404 returned error can't find the container with id 7db567777b792a579e9315bdb8efeedf99ebc6ade1918cdc7fbf7edb78a72eb8 Mar 17 01:21:08 crc kubenswrapper[4735]: I0317 01:21:08.875708 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4t98c"] Mar 17 01:21:09 crc kubenswrapper[4735]: I0317 01:21:09.303908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4t98c" event={"ID":"a4d0b9dc-dc40-431d-9fc3-8a45378faaf9","Type":"ContainerStarted","Data":"151360e4361dbeab45d0a75da2b45c60f9d46a970ae483996272bdaf35e61108"} Mar 17 01:21:09 crc kubenswrapper[4735]: I0317 01:21:09.305513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" event={"ID":"2c5a87ba-17e0-4807-980f-f42af0ffb51a","Type":"ContainerStarted","Data":"7db567777b792a579e9315bdb8efeedf99ebc6ade1918cdc7fbf7edb78a72eb8"} Mar 17 01:21:09 crc kubenswrapper[4735]: I0317 01:21:09.306887 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" 
event={"ID":"f026bd7d-4093-433f-b42a-e2f88dbd2c7f","Type":"ContainerStarted","Data":"89111bddc0f3009c869703692743223559440ef818ea053ac17d694f7d0f61b8"} Mar 17 01:21:12 crc kubenswrapper[4735]: I0317 01:21:12.882369 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-9sppg" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="registry-server" probeResult="failure" output=< Mar 17 01:21:12 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:21:12 crc kubenswrapper[4735]: > Mar 17 01:21:12 crc kubenswrapper[4735]: I0317 01:21:12.912773 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-9sppg" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="registry-server" probeResult="failure" output=< Mar 17 01:21:12 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:21:12 crc kubenswrapper[4735]: > Mar 17 01:21:13 crc kubenswrapper[4735]: I0317 01:21:13.939509 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" event={"ID":"f026bd7d-4093-433f-b42a-e2f88dbd2c7f","Type":"ContainerStarted","Data":"950e439110ec7b879bf9b31f94b83e6b5926edabdb623f070c6dec3e15f2311e"} Mar 17 01:21:13 crc kubenswrapper[4735]: I0317 01:21:13.941743 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" event={"ID":"2c5a87ba-17e0-4807-980f-f42af0ffb51a","Type":"ContainerStarted","Data":"9ff1215c5d8238f3483858e1beb7229105872a7c1539430e9c924ad3de55a7d1"} Mar 17 01:21:13 crc kubenswrapper[4735]: I0317 01:21:13.942447 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:13 crc kubenswrapper[4735]: I0317 01:21:13.961597 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-5g5gl" podStartSLOduration=1.485453472 podStartE2EDuration="5.961565545s" podCreationTimestamp="2026-03-17 01:21:08 +0000 UTC" firstStartedPulling="2026-03-17 01:21:08.720020473 +0000 UTC m=+694.352253451" lastFinishedPulling="2026-03-17 01:21:13.196132556 +0000 UTC m=+698.828365524" observedRunningTime="2026-03-17 01:21:13.956719479 +0000 UTC m=+699.588952467" watchObservedRunningTime="2026-03-17 01:21:13.961565545 +0000 UTC m=+699.593798533" Mar 17 01:21:14 crc kubenswrapper[4735]: I0317 01:21:14.001877 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" podStartSLOduration=1.600569069 podStartE2EDuration="6.001833429s" podCreationTimestamp="2026-03-17 01:21:08 +0000 UTC" firstStartedPulling="2026-03-17 01:21:08.761418164 +0000 UTC m=+694.393651142" lastFinishedPulling="2026-03-17 01:21:13.162682524 +0000 UTC m=+698.794915502" observedRunningTime="2026-03-17 01:21:13.982233094 +0000 UTC m=+699.614466082" watchObservedRunningTime="2026-03-17 01:21:14.001833429 +0000 UTC m=+699.634066417" Mar 17 01:21:14 crc kubenswrapper[4735]: I0317 01:21:14.950441 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4t98c" event={"ID":"a4d0b9dc-dc40-431d-9fc3-8a45378faaf9","Type":"ContainerStarted","Data":"6144c692e6ecb84154343659ef62c46a7a2699759e33abc2ce6e5273dba01594"} Mar 17 01:21:14 crc kubenswrapper[4735]: I0317 01:21:14.975937 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-4t98c" podStartSLOduration=1.9616395899999999 podStartE2EDuration="6.975901548s" podCreationTimestamp="2026-03-17 01:21:08 +0000 UTC" firstStartedPulling="2026-03-17 01:21:08.882458921 +0000 UTC m=+694.514691899" lastFinishedPulling="2026-03-17 01:21:13.896720859 +0000 UTC m=+699.528953857" observedRunningTime="2026-03-17 01:21:14.972274932 +0000 UTC 
m=+700.604507960" watchObservedRunningTime="2026-03-17 01:21:14.975901548 +0000 UTC m=+700.608134556" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.017812 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5mhq"] Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026545 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-controller" containerID="cri-o://833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026607 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="nbdb" containerID="cri-o://9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026644 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="sbdb" containerID="cri-o://4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026706 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="northd" containerID="cri-o://9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026755 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026764 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-acl-logging" containerID="cri-o://a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.026812 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-node" containerID="cri-o://8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.068714 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" containerID="cri-o://6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" gracePeriod=30 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.290825 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/3.log" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.299203 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovn-acl-logging/0.log" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.300372 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovn-controller/0.log" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.301021 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350534 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jmfxl"] Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350718 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-ovn-metrics" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350730 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-ovn-metrics" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350738 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350744 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350750 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kubecfg-setup" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350756 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kubecfg-setup" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350763 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350769 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350777 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350782 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350791 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350798 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350808 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350814 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350822 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-node" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350828 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-node" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350835 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="nbdb" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350841 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="nbdb" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350868 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-acl-logging" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350873 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-acl-logging" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350880 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="northd" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350886 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="northd" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.350894 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="sbdb" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.350900 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="sbdb" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351000 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-node" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351012 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="kube-rbac-proxy-ovn-metrics" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351018 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351025 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351033 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="northd" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351041 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351049 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351057 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="nbdb" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351064 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovn-acl-logging" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351072 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="sbdb" Mar 17 01:21:18 crc kubenswrapper[4735]: E0317 01:21:18.351164 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351171 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351266 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.351276 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerName="ovnkube-controller" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.352743 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388409 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-var-lib-openvswitch\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-openvswitch\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388490 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-ovn\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388505 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-ovn-kubernetes\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388530 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-bin\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388539 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388559 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-node-log\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388610 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-node-log" (OuterVolumeSpecName: "node-log") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388612 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388622 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-script-lib\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388638 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388644 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388648 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-log-socket\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388667 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-log-socket" (OuterVolumeSpecName: "log-socket") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388669 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-netns\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388686 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388708 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhbd4\" (UniqueName: \"kubernetes.io/projected/5d25c473-740d-4af9-b5f7-72bfc5d911a4-kube-api-access-fhbd4\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388729 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-netd\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388744 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-slash\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388781 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-systemd-units\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388803 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-env-overrides\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388826 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-systemd\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388880 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-kubelet\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388935 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovn-node-metrics-cert\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388969 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-config\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: 
"5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388985 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-etc-openvswitch\") pod \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\" (UID: \"5d25c473-740d-4af9-b5f7-72bfc5d911a4\") " Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389102 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovnkube-script-lib\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389127 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqv2v\" (UniqueName: \"kubernetes.io/projected/60fc517b-5f37-4bfa-a567-10c48a5c8d11-kube-api-access-jqv2v\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-ovn\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-log-socket\") pod 
\"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-node-log\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-systemd-units\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovnkube-config\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389236 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovn-node-metrics-cert\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389253 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389294 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-etc-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-systemd\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389326 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-run-netns\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389344 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-env-overrides\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389359 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-slash\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389376 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-kubelet\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-cni-netd\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389450 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-var-lib-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-cni-bin\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389501 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389511 4735 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-node-log\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389519 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389528 4735 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-log-socket\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389537 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389545 4735 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389554 4735 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389562 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389570 4735 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.388963 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389007 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389026 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389038 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389237 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.389749 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-slash" (OuterVolumeSpecName: "host-slash") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.390224 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.390248 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.394409 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d25c473-740d-4af9-b5f7-72bfc5d911a4-kube-api-access-fhbd4" (OuterVolumeSpecName: "kube-api-access-fhbd4") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "kube-api-access-fhbd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.394771 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.403233 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5d25c473-740d-4af9-b5f7-72bfc5d911a4" (UID: "5d25c473-740d-4af9-b5f7-72bfc5d911a4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.463399 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqv2v\" (UniqueName: \"kubernetes.io/projected/60fc517b-5f37-4bfa-a567-10c48a5c8d11-kube-api-access-jqv2v\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491167 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-ovn\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491212 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-log-socket\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491244 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-node-log\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-systemd-units\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovn-node-metrics-cert\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491316 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovnkube-config\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491388 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491414 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-etc-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491435 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-systemd\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-run-netns\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-env-overrides\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-slash\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491526 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491551 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-kubelet\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491571 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-cni-netd\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-var-lib-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-cni-bin\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491645 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovnkube-script-lib\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491697 4735 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491710 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491723 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491734 4735 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491747 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhbd4\" (UniqueName: \"kubernetes.io/projected/5d25c473-740d-4af9-b5f7-72bfc5d911a4-kube-api-access-fhbd4\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491758 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491769 4735 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-slash\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491781 4735 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491792 4735 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491804 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d25c473-740d-4af9-b5f7-72bfc5d911a4-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.491815 4735 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/5d25c473-740d-4af9-b5f7-72bfc5d911a4-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492143 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-kubelet\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-log-socket\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492231 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-var-lib-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492262 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-slash\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 
crc kubenswrapper[4735]: I0317 01:21:18.492269 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-etc-openvswitch\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492264 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-cni-netd\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-ovn\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-run-systemd\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-node-log\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-cni-bin\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492376 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-systemd-units\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492144 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60fc517b-5f37-4bfa-a567-10c48a5c8d11-host-run-netns\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.492697 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-env-overrides\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.493218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovnkube-config\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.493383 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovnkube-script-lib\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.497011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60fc517b-5f37-4bfa-a567-10c48a5c8d11-ovn-node-metrics-cert\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.516198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqv2v\" (UniqueName: \"kubernetes.io/projected/60fc517b-5f37-4bfa-a567-10c48a5c8d11-kube-api-access-jqv2v\") pod \"ovnkube-node-jmfxl\" (UID: \"60fc517b-5f37-4bfa-a567-10c48a5c8d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.668486 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:18 crc kubenswrapper[4735]: W0317 01:21:18.689965 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60fc517b_5f37_4bfa_a567_10c48a5c8d11.slice/crio-ffe8fe946abb781cc4cbc1f9b09dd09fb49565772f8c25b3ab9a2791934657b8 WatchSource:0}: Error finding container ffe8fe946abb781cc4cbc1f9b09dd09fb49565772f8c25b3ab9a2791934657b8: Status 404 returned error can't find the container with id ffe8fe946abb781cc4cbc1f9b09dd09fb49565772f8c25b3ab9a2791934657b8 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.982510 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovnkube-controller/3.log" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.987037 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovn-acl-logging/0.log" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.987750 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5mhq_5d25c473-740d-4af9-b5f7-72bfc5d911a4/ovn-controller/0.log" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988210 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988265 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988289 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" 
containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988312 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988326 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988340 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988354 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" exitCode=143 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988367 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" exitCode=143 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988442 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988492 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" 
event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988572 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988591 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988608 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988619 4735 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988631 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988642 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988653 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988663 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988674 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988685 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988698 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} Mar 17 
01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988717 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988730 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988741 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988752 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988763 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988773 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988784 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988795 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} Mar 17 
01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988805 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988816 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988829 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988846 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988892 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988904 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988915 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988925 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988936 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988946 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988957 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988968 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988979 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.988993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" event={"ID":"5d25c473-740d-4af9-b5f7-72bfc5d911a4","Type":"ContainerDied","Data":"2f60c084541cd59fca628ae9acd4f0ca315f093783b0040c8723163107ba4d6a"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989009 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989021 4735 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989031 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989042 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989053 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989063 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989074 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989085 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989095 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989108 4735 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989133 4735 scope.go:117] "RemoveContainer" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.989356 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5mhq" Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.997798 4735 generic.go:334] "Generic (PLEG): container finished" podID="60fc517b-5f37-4bfa-a567-10c48a5c8d11" containerID="9b44aa7aa7804358ea7b07afaad47d0e629f271d60f07567bf09b19b2a4739db" exitCode=0 Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.997996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerDied","Data":"9b44aa7aa7804358ea7b07afaad47d0e629f271d60f07567bf09b19b2a4739db"} Mar 17 01:21:18 crc kubenswrapper[4735]: I0317 01:21:18.998044 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"ffe8fe946abb781cc4cbc1f9b09dd09fb49565772f8c25b3ab9a2791934657b8"} Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.002333 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/2.log" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.003300 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/1.log" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.003369 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d" containerID="98b9f889d1e171e0f5a13534a10cf074ad341068dd1ccb49e17cd309da51ecaf" exitCode=2 Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.003409 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerDied","Data":"98b9f889d1e171e0f5a13534a10cf074ad341068dd1ccb49e17cd309da51ecaf"} Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.003439 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1"} Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.004080 4735 scope.go:117] "RemoveContainer" containerID="98b9f889d1e171e0f5a13534a10cf074ad341068dd1ccb49e17cd309da51ecaf" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.004377 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mm58f_openshift-multus(a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d)\"" pod="openshift-multus/multus-mm58f" podUID="a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.039247 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.093327 4735 scope.go:117] "RemoveContainer" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.121070 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5mhq"] Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.121115 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5mhq"] Mar 17 01:21:19 crc 
kubenswrapper[4735]: I0317 01:21:19.124411 4735 scope.go:117] "RemoveContainer" containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.138592 4735 scope.go:117] "RemoveContainer" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.162656 4735 scope.go:117] "RemoveContainer" containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.197151 4735 scope.go:117] "RemoveContainer" containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.227684 4735 scope.go:117] "RemoveContainer" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.252828 4735 scope.go:117] "RemoveContainer" containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.276839 4735 scope.go:117] "RemoveContainer" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.297456 4735 scope.go:117] "RemoveContainer" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.297924 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": container with ID starting with 6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5 not found: ID does not exist" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.297969 4735 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} err="failed to get container status \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": rpc error: code = NotFound desc = could not find container \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": container with ID starting with 6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.298000 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.298519 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": container with ID starting with 45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0 not found: ID does not exist" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.298554 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} err="failed to get container status \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": rpc error: code = NotFound desc = could not find container \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": container with ID starting with 45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.298577 4735 scope.go:117] "RemoveContainer" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.299231 4735 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": container with ID starting with 4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da not found: ID does not exist" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.299261 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} err="failed to get container status \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": rpc error: code = NotFound desc = could not find container \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": container with ID starting with 4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.299276 4735 scope.go:117] "RemoveContainer" containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.301062 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": container with ID starting with 9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8 not found: ID does not exist" containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301087 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} err="failed to get container status \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": rpc error: code = NotFound desc = could not find container 
\"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": container with ID starting with 9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301100 4735 scope.go:117] "RemoveContainer" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.301317 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": container with ID starting with 9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea not found: ID does not exist" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301342 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} err="failed to get container status \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": rpc error: code = NotFound desc = could not find container \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": container with ID starting with 9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301362 4735 scope.go:117] "RemoveContainer" containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.301560 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": container with ID starting with 77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac not found: ID does not exist" 
containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301584 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} err="failed to get container status \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": rpc error: code = NotFound desc = could not find container \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": container with ID starting with 77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301599 4735 scope.go:117] "RemoveContainer" containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.301831 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": container with ID starting with 8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac not found: ID does not exist" containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301878 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} err="failed to get container status \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": rpc error: code = NotFound desc = could not find container \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": container with ID starting with 8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.301904 4735 scope.go:117] 
"RemoveContainer" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.302121 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": container with ID starting with a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d not found: ID does not exist" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302146 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} err="failed to get container status \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": rpc error: code = NotFound desc = could not find container \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": container with ID starting with a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302161 4735 scope.go:117] "RemoveContainer" containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.302363 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": container with ID starting with 833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38 not found: ID does not exist" containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302384 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} err="failed to get container status \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": rpc error: code = NotFound desc = could not find container \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": container with ID starting with 833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302398 4735 scope.go:117] "RemoveContainer" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" Mar 17 01:21:19 crc kubenswrapper[4735]: E0317 01:21:19.302594 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": container with ID starting with b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7 not found: ID does not exist" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302613 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} err="failed to get container status \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": rpc error: code = NotFound desc = could not find container \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": container with ID starting with b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302624 4735 scope.go:117] "RemoveContainer" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302813 4735 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} err="failed to get container status \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": rpc error: code = NotFound desc = could not find container \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": container with ID starting with 6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.302828 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303040 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} err="failed to get container status \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": rpc error: code = NotFound desc = could not find container \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": container with ID starting with 45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303060 4735 scope.go:117] "RemoveContainer" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303269 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} err="failed to get container status \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": rpc error: code = NotFound desc = could not find container \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": container with ID starting with 4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da not 
found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303287 4735 scope.go:117] "RemoveContainer" containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303538 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} err="failed to get container status \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": rpc error: code = NotFound desc = could not find container \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": container with ID starting with 9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303576 4735 scope.go:117] "RemoveContainer" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303842 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} err="failed to get container status \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": rpc error: code = NotFound desc = could not find container \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": container with ID starting with 9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.303880 4735 scope.go:117] "RemoveContainer" containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304073 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} err="failed to get 
container status \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": rpc error: code = NotFound desc = could not find container \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": container with ID starting with 77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304103 4735 scope.go:117] "RemoveContainer" containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304301 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} err="failed to get container status \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": rpc error: code = NotFound desc = could not find container \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": container with ID starting with 8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304378 4735 scope.go:117] "RemoveContainer" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304593 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} err="failed to get container status \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": rpc error: code = NotFound desc = could not find container \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": container with ID starting with a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304622 4735 scope.go:117] "RemoveContainer" 
containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304829 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} err="failed to get container status \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": rpc error: code = NotFound desc = could not find container \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": container with ID starting with 833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.304877 4735 scope.go:117] "RemoveContainer" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305098 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} err="failed to get container status \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": rpc error: code = NotFound desc = could not find container \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": container with ID starting with b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305121 4735 scope.go:117] "RemoveContainer" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305337 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} err="failed to get container status \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": rpc error: code = NotFound desc = could 
not find container \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": container with ID starting with 6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305362 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305588 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} err="failed to get container status \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": rpc error: code = NotFound desc = could not find container \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": container with ID starting with 45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305615 4735 scope.go:117] "RemoveContainer" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305811 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} err="failed to get container status \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": rpc error: code = NotFound desc = could not find container \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": container with ID starting with 4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.305833 4735 scope.go:117] "RemoveContainer" containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 
01:21:19.306139 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} err="failed to get container status \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": rpc error: code = NotFound desc = could not find container \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": container with ID starting with 9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.306162 4735 scope.go:117] "RemoveContainer" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.306378 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} err="failed to get container status \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": rpc error: code = NotFound desc = could not find container \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": container with ID starting with 9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.306403 4735 scope.go:117] "RemoveContainer" containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.306821 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} err="failed to get container status \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": rpc error: code = NotFound desc = could not find container \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": container with ID starting with 
77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.306847 4735 scope.go:117] "RemoveContainer" containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.307131 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} err="failed to get container status \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": rpc error: code = NotFound desc = could not find container \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": container with ID starting with 8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.307158 4735 scope.go:117] "RemoveContainer" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.307343 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} err="failed to get container status \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": rpc error: code = NotFound desc = could not find container \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": container with ID starting with a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.307369 4735 scope.go:117] "RemoveContainer" containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.307566 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} err="failed to get container status \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": rpc error: code = NotFound desc = could not find container \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": container with ID starting with 833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.307596 4735 scope.go:117] "RemoveContainer" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.308480 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} err="failed to get container status \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": rpc error: code = NotFound desc = could not find container \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": container with ID starting with b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.308508 4735 scope.go:117] "RemoveContainer" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.308711 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} err="failed to get container status \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": rpc error: code = NotFound desc = could not find container \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": container with ID starting with 6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5 not found: ID does not 
exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.308738 4735 scope.go:117] "RemoveContainer" containerID="45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.309587 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0"} err="failed to get container status \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": rpc error: code = NotFound desc = could not find container \"45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0\": container with ID starting with 45ede831b99186964f6054631f935e5c736ea25eb83c96cc979fdddf493ae7a0 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.309614 4735 scope.go:117] "RemoveContainer" containerID="4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.309823 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da"} err="failed to get container status \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": rpc error: code = NotFound desc = could not find container \"4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da\": container with ID starting with 4160de5b7bec381a0260d8a8ef8299ebb8aaec280071d718cdf25da5b10540da not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.309849 4735 scope.go:117] "RemoveContainer" containerID="9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310048 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8"} err="failed to get container status 
\"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": rpc error: code = NotFound desc = could not find container \"9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8\": container with ID starting with 9a22b1e631cadb258d563a7572dd5af23b2a53237107e43193fdd87167fdd8f8 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310074 4735 scope.go:117] "RemoveContainer" containerID="9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310298 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea"} err="failed to get container status \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": rpc error: code = NotFound desc = could not find container \"9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea\": container with ID starting with 9d9b112a57b42bc78159766f8305f98b84d9379ebe39fae4ceab0c16873fa1ea not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310329 4735 scope.go:117] "RemoveContainer" containerID="77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310518 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac"} err="failed to get container status \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": rpc error: code = NotFound desc = could not find container \"77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac\": container with ID starting with 77349111e6cc76de8aa4a41d28257a995dc788c7991663868851154d66d286ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310547 4735 scope.go:117] "RemoveContainer" 
containerID="8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310782 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac"} err="failed to get container status \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": rpc error: code = NotFound desc = could not find container \"8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac\": container with ID starting with 8fe8c0ee42d87c7f5551e0eceee2fd843128624e3119bca95f893185e803a0ac not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.310805 4735 scope.go:117] "RemoveContainer" containerID="a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311026 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d"} err="failed to get container status \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": rpc error: code = NotFound desc = could not find container \"a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d\": container with ID starting with a45b229c964b3287578573cc14f01a220f6ef60579a37d58235141f39deb460d not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311050 4735 scope.go:117] "RemoveContainer" containerID="833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311248 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38"} err="failed to get container status \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": rpc error: code = NotFound desc = could 
not find container \"833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38\": container with ID starting with 833cdeda73c3b2467ccc3c81fc4243ea8ac7256bd82275a1846b01cf7bb75d38 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311272 4735 scope.go:117] "RemoveContainer" containerID="b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311456 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7"} err="failed to get container status \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": rpc error: code = NotFound desc = could not find container \"b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7\": container with ID starting with b6df1353bbad289d27f1d70abba2d563075a05d1c3845f369c7ba1024ed5eba7 not found: ID does not exist" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311480 4735 scope.go:117] "RemoveContainer" containerID="6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5" Mar 17 01:21:19 crc kubenswrapper[4735]: I0317 01:21:19.311695 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5"} err="failed to get container status \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": rpc error: code = NotFound desc = could not find container \"6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5\": container with ID starting with 6d18948da525ae11ec2129a1caf73e22d490d436081a5d2a9b4ea60206c829a5 not found: ID does not exist" Mar 17 01:21:20 crc kubenswrapper[4735]: I0317 01:21:20.020191 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" 
event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"2e64f8dad2808e64a32fe81f6d0bb8698c64a5c07053a297a28055e6f916ff9e"} Mar 17 01:21:20 crc kubenswrapper[4735]: I0317 01:21:20.020226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"4c5bf7c314fa57dc969cfabdf58344bbbf523dc69de875898ab8413dd5708acb"} Mar 17 01:21:20 crc kubenswrapper[4735]: I0317 01:21:20.020238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"8a4434730ca2faaa772d107a40bf52f1ec3fce7fd4e3dc7d23cce8c57bdeb4c6"} Mar 17 01:21:20 crc kubenswrapper[4735]: I0317 01:21:20.020246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"d204f247f8b87bb728090ca79013c17e301b9678899214b6d4c6dc224286281e"} Mar 17 01:21:20 crc kubenswrapper[4735]: I0317 01:21:20.020254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"8d294de14c107c97b837818499338d2cebbdcf8506acada0d8405c7c2024f6d5"} Mar 17 01:21:20 crc kubenswrapper[4735]: I0317 01:21:20.020262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"c6a05f4548a759d5d54972affa26adf27db872886872d0f7dc7dc36cec126696"} Mar 17 01:21:21 crc kubenswrapper[4735]: I0317 01:21:21.084410 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d25c473-740d-4af9-b5f7-72bfc5d911a4" 
path="/var/lib/kubelet/pods/5d25c473-740d-4af9-b5f7-72bfc5d911a4/volumes" Mar 17 01:21:23 crc kubenswrapper[4735]: I0317 01:21:23.047348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"d8792042015baec7518c338af5c8b1554033fb29e8f5592945377940d9c1b29b"} Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.069503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" event={"ID":"60fc517b-5f37-4bfa-a567-10c48a5c8d11","Type":"ContainerStarted","Data":"42ed3be5b715b7236eca880f2d0b8628f69f83d01c5efc0b5dd17224fe961a04"} Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.070451 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.070486 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.070510 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.109956 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" podStartSLOduration=7.109937373 podStartE2EDuration="7.109937373s" podCreationTimestamp="2026-03-17 01:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:21:25.105810495 +0000 UTC m=+710.738043513" watchObservedRunningTime="2026-03-17 01:21:25.109937373 +0000 UTC m=+710.742170351" Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.135751 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:25 crc kubenswrapper[4735]: I0317 01:21:25.135815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:21:31 crc kubenswrapper[4735]: I0317 01:21:31.074954 4735 scope.go:117] "RemoveContainer" containerID="98b9f889d1e171e0f5a13534a10cf074ad341068dd1ccb49e17cd309da51ecaf" Mar 17 01:21:31 crc kubenswrapper[4735]: E0317 01:21:31.076052 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mm58f_openshift-multus(a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d)\"" pod="openshift-multus/multus-mm58f" podUID="a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d" Mar 17 01:21:35 crc kubenswrapper[4735]: I0317 01:21:35.629850 4735 scope.go:117] "RemoveContainer" containerID="645cc4a7987db8ab1e24af04b4cf5a310941ae5029ab3defb7e91ad369a46cd0" Mar 17 01:21:35 crc kubenswrapper[4735]: I0317 01:21:35.668845 4735 scope.go:117] "RemoveContainer" containerID="e0fd226f976b9d4c6ac83a9e840df8bec34ddffd2f501e1cf27272113854adb1" Mar 17 01:21:36 crc kubenswrapper[4735]: I0317 01:21:36.152946 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/2.log" Mar 17 01:21:46 crc kubenswrapper[4735]: I0317 01:21:46.073806 4735 scope.go:117] "RemoveContainer" containerID="98b9f889d1e171e0f5a13534a10cf074ad341068dd1ccb49e17cd309da51ecaf" Mar 17 01:21:47 crc kubenswrapper[4735]: I0317 01:21:47.222821 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mm58f_a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d/kube-multus/2.log" Mar 17 01:21:47 crc kubenswrapper[4735]: I0317 01:21:47.223211 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm58f" 
event={"ID":"a97aac3d-ac90-4e7e-b0e4-7b668ba15f6d","Type":"ContainerStarted","Data":"54f667584c4dc26348a5dc7988244bf0db9afe55cb1f78a11e6b77619ce53450"} Mar 17 01:21:48 crc kubenswrapper[4735]: I0317 01:21:48.715437 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jmfxl" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.161104 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561842-kckdb"] Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.162901 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.166679 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.166894 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.167162 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.173139 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-kckdb"] Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.247288 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxv7j\" (UniqueName: \"kubernetes.io/projected/2266c1f2-ce32-49ff-83aa-174c9ce402c2-kube-api-access-gxv7j\") pod \"auto-csr-approver-29561842-kckdb\" (UID: \"2266c1f2-ce32-49ff-83aa-174c9ce402c2\") " pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.349099 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gxv7j\" (UniqueName: \"kubernetes.io/projected/2266c1f2-ce32-49ff-83aa-174c9ce402c2-kube-api-access-gxv7j\") pod \"auto-csr-approver-29561842-kckdb\" (UID: \"2266c1f2-ce32-49ff-83aa-174c9ce402c2\") " pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.384104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxv7j\" (UniqueName: \"kubernetes.io/projected/2266c1f2-ce32-49ff-83aa-174c9ce402c2-kube-api-access-gxv7j\") pod \"auto-csr-approver-29561842-kckdb\" (UID: \"2266c1f2-ce32-49ff-83aa-174c9ce402c2\") " pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.492020 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:00 crc kubenswrapper[4735]: I0317 01:22:00.945904 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-kckdb"] Mar 17 01:22:01 crc kubenswrapper[4735]: I0317 01:22:01.326034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561842-kckdb" event={"ID":"2266c1f2-ce32-49ff-83aa-174c9ce402c2","Type":"ContainerStarted","Data":"caad03c9f5ddbf3dc1f1fc80b596781855949a01354e15a52cad4609ce57aba0"} Mar 17 01:22:02 crc kubenswrapper[4735]: I0317 01:22:02.982688 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7"] Mar 17 01:22:02 crc kubenswrapper[4735]: I0317 01:22:02.983904 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:02 crc kubenswrapper[4735]: I0317 01:22:02.997891 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7"] Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.000882 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.089631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.090616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lhjs\" (UniqueName: \"kubernetes.io/projected/237e71f6-eba3-4e8b-a219-18685f510184-kube-api-access-9lhjs\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.090742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: 
I0317 01:22:03.191876 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lhjs\" (UniqueName: \"kubernetes.io/projected/237e71f6-eba3-4e8b-a219-18685f510184-kube-api-access-9lhjs\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.191938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.191984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.192745 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.192754 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.230724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lhjs\" (UniqueName: \"kubernetes.io/projected/237e71f6-eba3-4e8b-a219-18685f510184-kube-api-access-9lhjs\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.297180 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.356725 4735 generic.go:334] "Generic (PLEG): container finished" podID="2266c1f2-ce32-49ff-83aa-174c9ce402c2" containerID="625dba166345dcf9adcc326bf6914bd654a08fbe8db12eb808ece852ba6163fc" exitCode=0 Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.356789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561842-kckdb" event={"ID":"2266c1f2-ce32-49ff-83aa-174c9ce402c2","Type":"ContainerDied","Data":"625dba166345dcf9adcc326bf6914bd654a08fbe8db12eb808ece852ba6163fc"} Mar 17 01:22:03 crc kubenswrapper[4735]: I0317 01:22:03.578842 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7"] Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.367751 4735 generic.go:334] "Generic (PLEG): container finished" podID="237e71f6-eba3-4e8b-a219-18685f510184" 
containerID="82572b8908108673c665e936f342c4627f857c26a9a83ffc3b603e4aa2cd75bc" exitCode=0 Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.367845 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" event={"ID":"237e71f6-eba3-4e8b-a219-18685f510184","Type":"ContainerDied","Data":"82572b8908108673c665e936f342c4627f857c26a9a83ffc3b603e4aa2cd75bc"} Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.368348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" event={"ID":"237e71f6-eba3-4e8b-a219-18685f510184","Type":"ContainerStarted","Data":"d023cc786c1a59762b052cd07fab59e8159add97d59e90e84e3dc23137bd98d4"} Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.699932 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.719000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxv7j\" (UniqueName: \"kubernetes.io/projected/2266c1f2-ce32-49ff-83aa-174c9ce402c2-kube-api-access-gxv7j\") pod \"2266c1f2-ce32-49ff-83aa-174c9ce402c2\" (UID: \"2266c1f2-ce32-49ff-83aa-174c9ce402c2\") " Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.735314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2266c1f2-ce32-49ff-83aa-174c9ce402c2-kube-api-access-gxv7j" (OuterVolumeSpecName: "kube-api-access-gxv7j") pod "2266c1f2-ce32-49ff-83aa-174c9ce402c2" (UID: "2266c1f2-ce32-49ff-83aa-174c9ce402c2"). InnerVolumeSpecName "kube-api-access-gxv7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:22:04 crc kubenswrapper[4735]: I0317 01:22:04.820596 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxv7j\" (UniqueName: \"kubernetes.io/projected/2266c1f2-ce32-49ff-83aa-174c9ce402c2-kube-api-access-gxv7j\") on node \"crc\" DevicePath \"\"" Mar 17 01:22:05 crc kubenswrapper[4735]: I0317 01:22:05.375653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561842-kckdb" event={"ID":"2266c1f2-ce32-49ff-83aa-174c9ce402c2","Type":"ContainerDied","Data":"caad03c9f5ddbf3dc1f1fc80b596781855949a01354e15a52cad4609ce57aba0"} Mar 17 01:22:05 crc kubenswrapper[4735]: I0317 01:22:05.376233 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caad03c9f5ddbf3dc1f1fc80b596781855949a01354e15a52cad4609ce57aba0" Mar 17 01:22:05 crc kubenswrapper[4735]: I0317 01:22:05.375709 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-kckdb" Mar 17 01:22:05 crc kubenswrapper[4735]: I0317 01:22:05.771229 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-7zvsl"] Mar 17 01:22:05 crc kubenswrapper[4735]: I0317 01:22:05.777157 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-7zvsl"] Mar 17 01:22:06 crc kubenswrapper[4735]: I0317 01:22:06.388921 4735 generic.go:334] "Generic (PLEG): container finished" podID="237e71f6-eba3-4e8b-a219-18685f510184" containerID="959cd2c741a50f4b2ede1452a507b5eb2614a8640dc44f90da7a28f04f71ed87" exitCode=0 Mar 17 01:22:06 crc kubenswrapper[4735]: I0317 01:22:06.389121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" 
event={"ID":"237e71f6-eba3-4e8b-a219-18685f510184","Type":"ContainerDied","Data":"959cd2c741a50f4b2ede1452a507b5eb2614a8640dc44f90da7a28f04f71ed87"} Mar 17 01:22:07 crc kubenswrapper[4735]: I0317 01:22:07.084139 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca4f1b3-ff9b-4e10-b32e-844d67c4ae48" path="/var/lib/kubelet/pods/cca4f1b3-ff9b-4e10-b32e-844d67c4ae48/volumes" Mar 17 01:22:07 crc kubenswrapper[4735]: I0317 01:22:07.401278 4735 generic.go:334] "Generic (PLEG): container finished" podID="237e71f6-eba3-4e8b-a219-18685f510184" containerID="58b406c7ca8bb015d8a13c264b83f746d7b3acb452e9317f331be844e2f3135f" exitCode=0 Mar 17 01:22:07 crc kubenswrapper[4735]: I0317 01:22:07.401340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" event={"ID":"237e71f6-eba3-4e8b-a219-18685f510184","Type":"ContainerDied","Data":"58b406c7ca8bb015d8a13c264b83f746d7b3acb452e9317f331be844e2f3135f"} Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.741951 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.772108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-util\") pod \"237e71f6-eba3-4e8b-a219-18685f510184\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.775054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-bundle\") pod \"237e71f6-eba3-4e8b-a219-18685f510184\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.775114 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lhjs\" (UniqueName: \"kubernetes.io/projected/237e71f6-eba3-4e8b-a219-18685f510184-kube-api-access-9lhjs\") pod \"237e71f6-eba3-4e8b-a219-18685f510184\" (UID: \"237e71f6-eba3-4e8b-a219-18685f510184\") " Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.775507 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-bundle" (OuterVolumeSpecName: "bundle") pod "237e71f6-eba3-4e8b-a219-18685f510184" (UID: "237e71f6-eba3-4e8b-a219-18685f510184"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.775709 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.782963 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/237e71f6-eba3-4e8b-a219-18685f510184-kube-api-access-9lhjs" (OuterVolumeSpecName: "kube-api-access-9lhjs") pod "237e71f6-eba3-4e8b-a219-18685f510184" (UID: "237e71f6-eba3-4e8b-a219-18685f510184"). InnerVolumeSpecName "kube-api-access-9lhjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.799267 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-util" (OuterVolumeSpecName: "util") pod "237e71f6-eba3-4e8b-a219-18685f510184" (UID: "237e71f6-eba3-4e8b-a219-18685f510184"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.876704 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lhjs\" (UniqueName: \"kubernetes.io/projected/237e71f6-eba3-4e8b-a219-18685f510184-kube-api-access-9lhjs\") on node \"crc\" DevicePath \"\"" Mar 17 01:22:08 crc kubenswrapper[4735]: I0317 01:22:08.876739 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/237e71f6-eba3-4e8b-a219-18685f510184-util\") on node \"crc\" DevicePath \"\"" Mar 17 01:22:09 crc kubenswrapper[4735]: I0317 01:22:09.417570 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" event={"ID":"237e71f6-eba3-4e8b-a219-18685f510184","Type":"ContainerDied","Data":"d023cc786c1a59762b052cd07fab59e8159add97d59e90e84e3dc23137bd98d4"} Mar 17 01:22:09 crc kubenswrapper[4735]: I0317 01:22:09.418317 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d023cc786c1a59762b052cd07fab59e8159add97d59e90e84e3dc23137bd98d4" Mar 17 01:22:09 crc kubenswrapper[4735]: I0317 01:22:09.417659 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.627654 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn"] Mar 17 01:22:11 crc kubenswrapper[4735]: E0317 01:22:11.627939 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="extract" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.627955 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="extract" Mar 17 01:22:11 crc kubenswrapper[4735]: E0317 01:22:11.627970 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="pull" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.627978 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="pull" Mar 17 01:22:11 crc kubenswrapper[4735]: E0317 01:22:11.627992 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="util" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.628000 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="util" Mar 17 01:22:11 crc kubenswrapper[4735]: E0317 01:22:11.628009 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2266c1f2-ce32-49ff-83aa-174c9ce402c2" containerName="oc" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.628016 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2266c1f2-ce32-49ff-83aa-174c9ce402c2" containerName="oc" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.628124 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="237e71f6-eba3-4e8b-a219-18685f510184" containerName="extract" Mar 17 
01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.628141 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2266c1f2-ce32-49ff-83aa-174c9ce402c2" containerName="oc" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.628556 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.630830 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.631279 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-8dr8d" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.631775 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.653343 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn"] Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.712451 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vx24\" (UniqueName: \"kubernetes.io/projected/a2a8ea5f-24b7-4ba0-a197-9f9700436a0e-kube-api-access-6vx24\") pod \"nmstate-operator-796d4cfff4-6h9tn\" (UID: \"a2a8ea5f-24b7-4ba0-a197-9f9700436a0e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.813605 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vx24\" (UniqueName: \"kubernetes.io/projected/a2a8ea5f-24b7-4ba0-a197-9f9700436a0e-kube-api-access-6vx24\") pod \"nmstate-operator-796d4cfff4-6h9tn\" (UID: \"a2a8ea5f-24b7-4ba0-a197-9f9700436a0e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" Mar 17 01:22:11 crc kubenswrapper[4735]: 
I0317 01:22:11.845665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx24\" (UniqueName: \"kubernetes.io/projected/a2a8ea5f-24b7-4ba0-a197-9f9700436a0e-kube-api-access-6vx24\") pod \"nmstate-operator-796d4cfff4-6h9tn\" (UID: \"a2a8ea5f-24b7-4ba0-a197-9f9700436a0e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" Mar 17 01:22:11 crc kubenswrapper[4735]: I0317 01:22:11.953354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" Mar 17 01:22:12 crc kubenswrapper[4735]: I0317 01:22:12.388513 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn"] Mar 17 01:22:12 crc kubenswrapper[4735]: I0317 01:22:12.464656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" event={"ID":"a2a8ea5f-24b7-4ba0-a197-9f9700436a0e","Type":"ContainerStarted","Data":"01c59d934b7f44938c34d2d2b2b5a052acbd1b75adb8ea3be2bf267172371be6"} Mar 17 01:22:12 crc kubenswrapper[4735]: I0317 01:22:12.606093 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:22:12 crc kubenswrapper[4735]: I0317 01:22:12.606350 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:22:15 crc kubenswrapper[4735]: I0317 01:22:15.485329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" 
event={"ID":"a2a8ea5f-24b7-4ba0-a197-9f9700436a0e","Type":"ContainerStarted","Data":"e295db21782dc7666486c9abc4552dcc9a74962fac8827230ce4372011d996bc"} Mar 17 01:22:15 crc kubenswrapper[4735]: I0317 01:22:15.518045 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6h9tn" podStartSLOduration=2.120912006 podStartE2EDuration="4.51801484s" podCreationTimestamp="2026-03-17 01:22:11 +0000 UTC" firstStartedPulling="2026-03-17 01:22:12.399712348 +0000 UTC m=+758.031945326" lastFinishedPulling="2026-03-17 01:22:14.796815182 +0000 UTC m=+760.429048160" observedRunningTime="2026-03-17 01:22:15.512527245 +0000 UTC m=+761.144760263" watchObservedRunningTime="2026-03-17 01:22:15.51801484 +0000 UTC m=+761.150247858" Mar 17 01:22:35 crc kubenswrapper[4735]: I0317 01:22:35.776366 4735 scope.go:117] "RemoveContainer" containerID="52e2d20b79c2019eec8fd538dfe4bdfeda4fc7eb6df5b8df4c53d7fe63de8b2a" Mar 17 01:22:37 crc kubenswrapper[4735]: I0317 01:22:37.989686 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds"] Mar 17 01:22:37 crc kubenswrapper[4735]: I0317 01:22:37.991107 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" Mar 17 01:22:37 crc kubenswrapper[4735]: I0317 01:22:37.998160 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rbclm"] Mar 17 01:22:37 crc kubenswrapper[4735]: I0317 01:22:37.998784 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:37 crc kubenswrapper[4735]: I0317 01:22:37.999353 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-6zztj" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.001911 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.039420 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qvt46"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.040052 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.047338 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rbclm"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.074398 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.141764 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.142377 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.144330 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.144371 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lplfb" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.144391 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.153413 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cfds\" (UniqueName: \"kubernetes.io/projected/5f46807e-1877-476e-aeb0-7e1acfe206da-kube-api-access-5cfds\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154058 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-dbus-socket\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154094 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-ovs-socket\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 
01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154114 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k9nw\" (UniqueName: \"kubernetes.io/projected/1a943747-e426-437c-a203-7d326d2b1cc1-kube-api-access-9k9nw\") pod \"nmstate-metrics-9b8c8685d-tl9ds\" (UID: \"1a943747-e426-437c-a203-7d326d2b1cc1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfnwr\" (UniqueName: \"kubernetes.io/projected/b4c72542-813d-4029-a6ef-f76feb3f6459-kube-api-access-sfnwr\") pod \"nmstate-webhook-5f558f5558-rbclm\" (UID: \"b4c72542-813d-4029-a6ef-f76feb3f6459\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b4c72542-813d-4029-a6ef-f76feb3f6459-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rbclm\" (UID: \"b4c72542-813d-4029-a6ef-f76feb3f6459\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.154183 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-nmstate-lock\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-ovs-socket\") pod \"nmstate-handler-qvt46\" (UID: 
\"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-ovs-socket\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k9nw\" (UniqueName: \"kubernetes.io/projected/1a943747-e426-437c-a203-7d326d2b1cc1-kube-api-access-9k9nw\") pod \"nmstate-metrics-9b8c8685d-tl9ds\" (UID: \"1a943747-e426-437c-a203-7d326d2b1cc1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255315 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9cn2\" (UniqueName: \"kubernetes.io/projected/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-kube-api-access-s9cn2\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255396 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfnwr\" (UniqueName: 
\"kubernetes.io/projected/b4c72542-813d-4029-a6ef-f76feb3f6459-kube-api-access-sfnwr\") pod \"nmstate-webhook-5f558f5558-rbclm\" (UID: \"b4c72542-813d-4029-a6ef-f76feb3f6459\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255420 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b4c72542-813d-4029-a6ef-f76feb3f6459-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rbclm\" (UID: \"b4c72542-813d-4029-a6ef-f76feb3f6459\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255460 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-nmstate-lock\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255492 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.255646 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-nmstate-lock\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.256228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cfds\" 
(UniqueName: \"kubernetes.io/projected/5f46807e-1877-476e-aeb0-7e1acfe206da-kube-api-access-5cfds\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.256257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-dbus-socket\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.256436 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5f46807e-1877-476e-aeb0-7e1acfe206da-dbus-socket\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.261519 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b4c72542-813d-4029-a6ef-f76feb3f6459-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rbclm\" (UID: \"b4c72542-813d-4029-a6ef-f76feb3f6459\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.278704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfnwr\" (UniqueName: \"kubernetes.io/projected/b4c72542-813d-4029-a6ef-f76feb3f6459-kube-api-access-sfnwr\") pod \"nmstate-webhook-5f558f5558-rbclm\" (UID: \"b4c72542-813d-4029-a6ef-f76feb3f6459\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.285066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k9nw\" (UniqueName: 
\"kubernetes.io/projected/1a943747-e426-437c-a203-7d326d2b1cc1-kube-api-access-9k9nw\") pod \"nmstate-metrics-9b8c8685d-tl9ds\" (UID: \"1a943747-e426-437c-a203-7d326d2b1cc1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.311552 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.321250 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.335406 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cfds\" (UniqueName: \"kubernetes.io/projected/5f46807e-1877-476e-aeb0-7e1acfe206da-kube-api-access-5cfds\") pod \"nmstate-handler-qvt46\" (UID: \"5f46807e-1877-476e-aeb0-7e1acfe206da\") " pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.356764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.356832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9cn2\" (UniqueName: \"kubernetes.io/projected/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-kube-api-access-s9cn2\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.356870 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.357650 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.363361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.368263 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.379352 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c56d5bdb9-s7l5w"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.380014 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.382826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9cn2\" (UniqueName: \"kubernetes.io/projected/dfa311dd-f771-40b1-9b71-0dd3f2e09ba6-kube-api-access-s9cn2\") pod \"nmstate-console-plugin-86f58fcf4-4jtvs\" (UID: \"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.403261 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c56d5bdb9-s7l5w"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.454110 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-oauth-serving-cert\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563283 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-serving-cert\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-oauth-config\") pod 
\"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563369 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-service-ca\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-trusted-ca-bundle\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-config\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.563452 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6dc\" (UniqueName: \"kubernetes.io/projected/52fc4922-78c1-43d0-95e4-36a02a3db8c6-kube-api-access-6s6dc\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.644906 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qvt46" 
event={"ID":"5f46807e-1877-476e-aeb0-7e1acfe206da","Type":"ContainerStarted","Data":"e54eef69f2aca595d3194092e462a7f3d0d527204ed13b63d7676894c5852e49"} Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664587 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-oauth-serving-cert\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-serving-cert\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-oauth-config\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-service-ca\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-trusted-ca-bundle\") pod 
\"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-config\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.664745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6dc\" (UniqueName: \"kubernetes.io/projected/52fc4922-78c1-43d0-95e4-36a02a3db8c6-kube-api-access-6s6dc\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.666313 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-oauth-serving-cert\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.667463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-trusted-ca-bundle\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.667656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-config\") pod \"console-5c56d5bdb9-s7l5w\" (UID: 
\"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.667089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52fc4922-78c1-43d0-95e4-36a02a3db8c6-service-ca\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.672534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-oauth-config\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.674501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fc4922-78c1-43d0-95e4-36a02a3db8c6-console-serving-cert\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.679252 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6dc\" (UniqueName: \"kubernetes.io/projected/52fc4922-78c1-43d0-95e4-36a02a3db8c6-kube-api-access-6s6dc\") pod \"console-5c56d5bdb9-s7l5w\" (UID: \"52fc4922-78c1-43d0-95e4-36a02a3db8c6\") " pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.721627 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.858398 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds"] Mar 17 01:22:38 crc kubenswrapper[4735]: W0317 01:22:38.863040 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a943747_e426_437c_a203_7d326d2b1cc1.slice/crio-251df2669aea28e7ed51d5434ce578bc877ce321766f33d3d3882a0c0c86e972 WatchSource:0}: Error finding container 251df2669aea28e7ed51d5434ce578bc877ce321766f33d3d3882a0c0c86e972: Status 404 returned error can't find the container with id 251df2669aea28e7ed51d5434ce578bc877ce321766f33d3d3882a0c0c86e972 Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.875530 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c56d5bdb9-s7l5w"] Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.896761 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rbclm"] Mar 17 01:22:38 crc kubenswrapper[4735]: W0317 01:22:38.898866 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c72542_813d_4029_a6ef_f76feb3f6459.slice/crio-4491852b784dd433349c3c96cb7553c70227880ba2154baf6907fb7b5dd4c69f WatchSource:0}: Error finding container 4491852b784dd433349c3c96cb7553c70227880ba2154baf6907fb7b5dd4c69f: Status 404 returned error can't find the container with id 4491852b784dd433349c3c96cb7553c70227880ba2154baf6907fb7b5dd4c69f Mar 17 01:22:38 crc kubenswrapper[4735]: I0317 01:22:38.929954 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs"] Mar 17 01:22:38 crc kubenswrapper[4735]: W0317 01:22:38.938167 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa311dd_f771_40b1_9b71_0dd3f2e09ba6.slice/crio-11032063c0d24d0c094181f85d1aed408f7bc653eacd105df4d3116c045f12d2 WatchSource:0}: Error finding container 11032063c0d24d0c094181f85d1aed408f7bc653eacd105df4d3116c045f12d2: Status 404 returned error can't find the container with id 11032063c0d24d0c094181f85d1aed408f7bc653eacd105df4d3116c045f12d2 Mar 17 01:22:39 crc kubenswrapper[4735]: I0317 01:22:39.655275 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c56d5bdb9-s7l5w" event={"ID":"52fc4922-78c1-43d0-95e4-36a02a3db8c6","Type":"ContainerStarted","Data":"f10a87d70e01a17ee3385dcc53d353de4130a6700ff2f83a36e38aaeb796814f"} Mar 17 01:22:39 crc kubenswrapper[4735]: I0317 01:22:39.655756 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c56d5bdb9-s7l5w" event={"ID":"52fc4922-78c1-43d0-95e4-36a02a3db8c6","Type":"ContainerStarted","Data":"18ca4189c6eb638762012d01e41489c00fd4906ef1016c34aeaeac3bcc366770"} Mar 17 01:22:39 crc kubenswrapper[4735]: I0317 01:22:39.658321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" event={"ID":"b4c72542-813d-4029-a6ef-f76feb3f6459","Type":"ContainerStarted","Data":"4491852b784dd433349c3c96cb7553c70227880ba2154baf6907fb7b5dd4c69f"} Mar 17 01:22:39 crc kubenswrapper[4735]: I0317 01:22:39.661249 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" event={"ID":"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6","Type":"ContainerStarted","Data":"11032063c0d24d0c094181f85d1aed408f7bc653eacd105df4d3116c045f12d2"} Mar 17 01:22:39 crc kubenswrapper[4735]: I0317 01:22:39.663360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" 
event={"ID":"1a943747-e426-437c-a203-7d326d2b1cc1","Type":"ContainerStarted","Data":"251df2669aea28e7ed51d5434ce578bc877ce321766f33d3d3882a0c0c86e972"} Mar 17 01:22:39 crc kubenswrapper[4735]: I0317 01:22:39.687750 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c56d5bdb9-s7l5w" podStartSLOduration=1.687720297 podStartE2EDuration="1.687720297s" podCreationTimestamp="2026-03-17 01:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:22:39.685556557 +0000 UTC m=+785.317789575" watchObservedRunningTime="2026-03-17 01:22:39.687720297 +0000 UTC m=+785.319953305" Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.606244 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.606726 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.697554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" event={"ID":"dfa311dd-f771-40b1-9b71-0dd3f2e09ba6","Type":"ContainerStarted","Data":"8fd782b8c93341cf618f95d79408d0bbb5911a5a7064732d32361ca760310b70"} Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.701050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" 
event={"ID":"1a943747-e426-437c-a203-7d326d2b1cc1","Type":"ContainerStarted","Data":"ec4370b5e395a1e0daf83919cae811fa81e8e0a34661946cc111356e8c028dc5"} Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.706280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qvt46" event={"ID":"5f46807e-1877-476e-aeb0-7e1acfe206da","Type":"ContainerStarted","Data":"9455fc9bb4b49a0da3a99a6c33716dca52e5aaf57d38728ab3d685555f7bab2a"} Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.707091 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.709422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" event={"ID":"b4c72542-813d-4029-a6ef-f76feb3f6459","Type":"ContainerStarted","Data":"0a6b92ff2e97363b6972542d6736f34544e61013865acc00639fb7ef7b28b7fa"} Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.709807 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.735369 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4jtvs" podStartSLOduration=1.7167584900000001 podStartE2EDuration="4.735345911s" podCreationTimestamp="2026-03-17 01:22:38 +0000 UTC" firstStartedPulling="2026-03-17 01:22:38.945136989 +0000 UTC m=+784.577369967" lastFinishedPulling="2026-03-17 01:22:41.96372441 +0000 UTC m=+787.595957388" observedRunningTime="2026-03-17 01:22:42.725711562 +0000 UTC m=+788.357944570" watchObservedRunningTime="2026-03-17 01:22:42.735345911 +0000 UTC m=+788.367578899" Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.765089 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" podStartSLOduration=2.696714543 podStartE2EDuration="5.765059672s" podCreationTimestamp="2026-03-17 01:22:37 +0000 UTC" firstStartedPulling="2026-03-17 01:22:38.900847747 +0000 UTC m=+784.533080725" lastFinishedPulling="2026-03-17 01:22:41.969192846 +0000 UTC m=+787.601425854" observedRunningTime="2026-03-17 01:22:42.757742554 +0000 UTC m=+788.389975562" watchObservedRunningTime="2026-03-17 01:22:42.765059672 +0000 UTC m=+788.397292690" Mar 17 01:22:42 crc kubenswrapper[4735]: I0317 01:22:42.789057 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qvt46" podStartSLOduration=1.2710969570000001 podStartE2EDuration="4.78903133s" podCreationTimestamp="2026-03-17 01:22:38 +0000 UTC" firstStartedPulling="2026-03-17 01:22:38.45113941 +0000 UTC m=+784.083372388" lastFinishedPulling="2026-03-17 01:22:41.969073773 +0000 UTC m=+787.601306761" observedRunningTime="2026-03-17 01:22:42.784606429 +0000 UTC m=+788.416839417" watchObservedRunningTime="2026-03-17 01:22:42.78903133 +0000 UTC m=+788.421264348" Mar 17 01:22:45 crc kubenswrapper[4735]: I0317 01:22:45.760933 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" event={"ID":"1a943747-e426-437c-a203-7d326d2b1cc1","Type":"ContainerStarted","Data":"1777d6f70cbc4c6a3f2a05891840557a3489f19260178e1b93a9772f14e55dd0"} Mar 17 01:22:45 crc kubenswrapper[4735]: I0317 01:22:45.801348 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tl9ds" podStartSLOduration=2.7674629 podStartE2EDuration="8.801311977s" podCreationTimestamp="2026-03-17 01:22:37 +0000 UTC" firstStartedPulling="2026-03-17 01:22:38.865071688 +0000 UTC m=+784.497304676" lastFinishedPulling="2026-03-17 01:22:44.898920735 +0000 UTC m=+790.531153753" observedRunningTime="2026-03-17 01:22:45.793119449 +0000 UTC m=+791.425352507" 
watchObservedRunningTime="2026-03-17 01:22:45.801311977 +0000 UTC m=+791.433544995" Mar 17 01:22:48 crc kubenswrapper[4735]: I0317 01:22:48.413717 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qvt46" Mar 17 01:22:48 crc kubenswrapper[4735]: I0317 01:22:48.721789 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:48 crc kubenswrapper[4735]: I0317 01:22:48.722174 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:48 crc kubenswrapper[4735]: I0317 01:22:48.762790 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:48 crc kubenswrapper[4735]: I0317 01:22:48.792748 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c56d5bdb9-s7l5w" Mar 17 01:22:48 crc kubenswrapper[4735]: I0317 01:22:48.887448 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nh28b"] Mar 17 01:22:58 crc kubenswrapper[4735]: I0317 01:22:58.329885 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rbclm" Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.607376 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.610075 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.610181 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.611254 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89c7cb5343209fd70c941b27bbdf54f6b99a837d5fb5aeda971561981f6bf62a"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.611362 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://89c7cb5343209fd70c941b27bbdf54f6b99a837d5fb5aeda971561981f6bf62a" gracePeriod=600 Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.985307 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="89c7cb5343209fd70c941b27bbdf54f6b99a837d5fb5aeda971561981f6bf62a" exitCode=0 Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.985573 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"89c7cb5343209fd70c941b27bbdf54f6b99a837d5fb5aeda971561981f6bf62a"} Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.985707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"7547be811f368c1658ee9c2df155bfbafe2bd7de1f9eeaf1d1a2d6245d328110"} Mar 17 01:23:12 crc kubenswrapper[4735]: I0317 01:23:12.985731 4735 scope.go:117] "RemoveContainer" containerID="991b107d6d47eeaa11d2f6b7ad11162b455fb9438638f292ae0c050ecb047ad9" Mar 17 01:23:13 crc kubenswrapper[4735]: I0317 01:23:13.854601 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg"] Mar 17 01:23:13 crc kubenswrapper[4735]: I0317 01:23:13.856915 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:13 crc kubenswrapper[4735]: I0317 01:23:13.858898 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 01:23:13 crc kubenswrapper[4735]: I0317 01:23:13.901471 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg"] Mar 17 01:23:13 crc kubenswrapper[4735]: I0317 01:23:13.935808 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nh28b" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerName="console" containerID="cri-o://12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9" gracePeriod=15 Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.003013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phm86\" (UniqueName: \"kubernetes.io/projected/0538ead5-a867-488c-819e-8ea63b6ad7ff-kube-api-access-phm86\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.003363 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.003411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.104489 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phm86\" (UniqueName: \"kubernetes.io/projected/0538ead5-a867-488c-819e-8ea63b6ad7ff-kube-api-access-phm86\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.104586 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc 
kubenswrapper[4735]: I0317 01:23:14.104708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.105697 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.106012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.132673 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phm86\" (UniqueName: \"kubernetes.io/projected/0538ead5-a867-488c-819e-8ea63b6ad7ff-kube-api-access-phm86\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.174602 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.296911 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nh28b_68d08823-e5f7-48eb-898e-3e59c772c8e9/console/0.log" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.297261 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.391877 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg"] Mar 17 01:23:14 crc kubenswrapper[4735]: W0317 01:23:14.397884 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0538ead5_a867_488c_819e_8ea63b6ad7ff.slice/crio-0a6194d9766b7cb2c5d6a172b61361973a88f13e1665fc57349a4c1d2883eec5 WatchSource:0}: Error finding container 0a6194d9766b7cb2c5d6a172b61361973a88f13e1665fc57349a4c1d2883eec5: Status 404 returned error can't find the container with id 0a6194d9766b7cb2c5d6a172b61361973a88f13e1665fc57349a4c1d2883eec5 Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.408068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-oauth-serving-cert\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.408122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-oauth-config\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: 
\"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.408144 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-service-ca\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.408842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.408887 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-service-ca" (OuterVolumeSpecName: "service-ca") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.408914 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-trusted-ca-bundle\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qvqn\" (UniqueName: \"kubernetes.io/projected/68d08823-e5f7-48eb-898e-3e59c772c8e9-kube-api-access-8qvqn\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409042 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-serving-cert\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409063 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-config\") pod \"68d08823-e5f7-48eb-898e-3e59c772c8e9\" (UID: \"68d08823-e5f7-48eb-898e-3e59c772c8e9\") " Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409241 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409252 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409430 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.409564 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-config" (OuterVolumeSpecName: "console-config") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.413560 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.413750 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d08823-e5f7-48eb-898e-3e59c772c8e9-kube-api-access-8qvqn" (OuterVolumeSpecName: "kube-api-access-8qvqn") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "kube-api-access-8qvqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.413779 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "68d08823-e5f7-48eb-898e-3e59c772c8e9" (UID: "68d08823-e5f7-48eb-898e-3e59c772c8e9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.510400 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.510589 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.510599 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qvqn\" (UniqueName: \"kubernetes.io/projected/68d08823-e5f7-48eb-898e-3e59c772c8e9-kube-api-access-8qvqn\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.510608 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4735]: I0317 01:23:14.510616 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68d08823-e5f7-48eb-898e-3e59c772c8e9-console-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.009549 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerID="d6e61ea2ae964d4561513006752949954f07565c49edf4a0e9fd83998e8ac9ee" exitCode=0 Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.009684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" event={"ID":"0538ead5-a867-488c-819e-8ea63b6ad7ff","Type":"ContainerDied","Data":"d6e61ea2ae964d4561513006752949954f07565c49edf4a0e9fd83998e8ac9ee"} Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.009725 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" event={"ID":"0538ead5-a867-488c-819e-8ea63b6ad7ff","Type":"ContainerStarted","Data":"0a6194d9766b7cb2c5d6a172b61361973a88f13e1665fc57349a4c1d2883eec5"} Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.013197 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nh28b_68d08823-e5f7-48eb-898e-3e59c772c8e9/console/0.log" Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.013254 4735 generic.go:334] "Generic (PLEG): container finished" podID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerID="12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9" exitCode=2 Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.013297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nh28b" event={"ID":"68d08823-e5f7-48eb-898e-3e59c772c8e9","Type":"ContainerDied","Data":"12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9"} Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.013332 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nh28b" event={"ID":"68d08823-e5f7-48eb-898e-3e59c772c8e9","Type":"ContainerDied","Data":"9a00c32f856ffafffc0a65b10abe8d6f626d28422375f35c3f17337ef3cc4e80"} Mar 17 01:23:15 crc 
kubenswrapper[4735]: I0317 01:23:15.013366 4735 scope.go:117] "RemoveContainer" containerID="12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9" Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.013489 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nh28b" Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.063060 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nh28b"] Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.066801 4735 scope.go:117] "RemoveContainer" containerID="12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9" Mar 17 01:23:15 crc kubenswrapper[4735]: E0317 01:23:15.067508 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9\": container with ID starting with 12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9 not found: ID does not exist" containerID="12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9" Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.067567 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9"} err="failed to get container status \"12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9\": rpc error: code = NotFound desc = could not find container \"12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9\": container with ID starting with 12740bfb992877e1bd95d392c1848edde0bc0527808f365cdafc26f4ba946fd9 not found: ID does not exist" Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.068193 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nh28b"] Mar 17 01:23:15 crc kubenswrapper[4735]: I0317 01:23:15.090361 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" path="/var/lib/kubelet/pods/68d08823-e5f7-48eb-898e-3e59c772c8e9/volumes" Mar 17 01:23:17 crc kubenswrapper[4735]: I0317 01:23:17.037935 4735 generic.go:334] "Generic (PLEG): container finished" podID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerID="a8f6ae72911e337a905791a958680da3c12f52ed1122f73186379f285ce99c6a" exitCode=0 Mar 17 01:23:17 crc kubenswrapper[4735]: I0317 01:23:17.038255 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" event={"ID":"0538ead5-a867-488c-819e-8ea63b6ad7ff","Type":"ContainerDied","Data":"a8f6ae72911e337a905791a958680da3c12f52ed1122f73186379f285ce99c6a"} Mar 17 01:23:18 crc kubenswrapper[4735]: I0317 01:23:18.048900 4735 generic.go:334] "Generic (PLEG): container finished" podID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerID="2f420ee5938c342bcaa037ff6e0af24931390d77f1f08455343cf801246bb8cf" exitCode=0 Mar 17 01:23:18 crc kubenswrapper[4735]: I0317 01:23:18.049014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" event={"ID":"0538ead5-a867-488c-819e-8ea63b6ad7ff","Type":"ContainerDied","Data":"2f420ee5938c342bcaa037ff6e0af24931390d77f1f08455343cf801246bb8cf"} Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.386906 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.583514 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-bundle\") pod \"0538ead5-a867-488c-819e-8ea63b6ad7ff\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.583983 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-util\") pod \"0538ead5-a867-488c-819e-8ea63b6ad7ff\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.584022 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phm86\" (UniqueName: \"kubernetes.io/projected/0538ead5-a867-488c-819e-8ea63b6ad7ff-kube-api-access-phm86\") pod \"0538ead5-a867-488c-819e-8ea63b6ad7ff\" (UID: \"0538ead5-a867-488c-819e-8ea63b6ad7ff\") " Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.584702 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-bundle" (OuterVolumeSpecName: "bundle") pod "0538ead5-a867-488c-819e-8ea63b6ad7ff" (UID: "0538ead5-a867-488c-819e-8ea63b6ad7ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.592030 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0538ead5-a867-488c-819e-8ea63b6ad7ff-kube-api-access-phm86" (OuterVolumeSpecName: "kube-api-access-phm86") pod "0538ead5-a867-488c-819e-8ea63b6ad7ff" (UID: "0538ead5-a867-488c-819e-8ea63b6ad7ff"). InnerVolumeSpecName "kube-api-access-phm86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.606002 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-util" (OuterVolumeSpecName: "util") pod "0538ead5-a867-488c-819e-8ea63b6ad7ff" (UID: "0538ead5-a867-488c-819e-8ea63b6ad7ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.685980 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.686016 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0538ead5-a867-488c-819e-8ea63b6ad7ff-util\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:19 crc kubenswrapper[4735]: I0317 01:23:19.686030 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phm86\" (UniqueName: \"kubernetes.io/projected/0538ead5-a867-488c-819e-8ea63b6ad7ff-kube-api-access-phm86\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:20 crc kubenswrapper[4735]: I0317 01:23:20.066256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" event={"ID":"0538ead5-a867-488c-819e-8ea63b6ad7ff","Type":"ContainerDied","Data":"0a6194d9766b7cb2c5d6a172b61361973a88f13e1665fc57349a4c1d2883eec5"} Mar 17 01:23:20 crc kubenswrapper[4735]: I0317 01:23:20.066303 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg" Mar 17 01:23:20 crc kubenswrapper[4735]: I0317 01:23:20.066322 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a6194d9766b7cb2c5d6a172b61361973a88f13e1665fc57349a4c1d2883eec5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530008 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5"] Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.530717 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerName="console" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530729 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerName="console" Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.530745 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="extract" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530753 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="extract" Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.530764 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="pull" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530774 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="pull" Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.530784 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="util" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530791 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="util" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530924 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0538ead5-a867-488c-819e-8ea63b6ad7ff" containerName="extract" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.530937 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d08823-e5f7-48eb-898e-3e59c772c8e9" containerName="console" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.531297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.538714 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.538763 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.541830 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kjgw6" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.542321 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.543479 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.550473 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5"] Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.597065 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2dw7\" (UniqueName: 
\"kubernetes.io/projected/81e7afb5-02be-49b0-bd12-39b2b2346a93-kube-api-access-s2dw7\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.597123 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81e7afb5-02be-49b0-bd12-39b2b2346a93-webhook-cert\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.597152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81e7afb5-02be-49b0-bd12-39b2b2346a93-apiservice-cert\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.698196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dw7\" (UniqueName: \"kubernetes.io/projected/81e7afb5-02be-49b0-bd12-39b2b2346a93-kube-api-access-s2dw7\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.698531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81e7afb5-02be-49b0-bd12-39b2b2346a93-webhook-cert\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: 
\"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.698557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81e7afb5-02be-49b0-bd12-39b2b2346a93-apiservice-cert\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.703877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81e7afb5-02be-49b0-bd12-39b2b2346a93-apiservice-cert\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.713590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81e7afb5-02be-49b0-bd12-39b2b2346a93-webhook-cert\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.738229 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dw7\" (UniqueName: \"kubernetes.io/projected/81e7afb5-02be-49b0-bd12-39b2b2346a93-kube-api-access-s2dw7\") pod \"metallb-operator-controller-manager-5bffff7ccd-ss6s5\" (UID: \"81e7afb5-02be-49b0-bd12-39b2b2346a93\") " pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.844388 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.906763 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst"] Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.907409 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:28 crc kubenswrapper[4735]: W0317 01:23:28.909433 4735 reflector.go:561] object-"metallb-system"/"controller-dockercfg-42vkm": failed to list *v1.Secret: secrets "controller-dockercfg-42vkm" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.909463 4735 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-42vkm\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-42vkm\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:23:28 crc kubenswrapper[4735]: W0317 01:23:28.909510 4735 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.909520 4735 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: 
secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:23:28 crc kubenswrapper[4735]: W0317 01:23:28.909615 4735 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 17 01:23:28 crc kubenswrapper[4735]: E0317 01:23:28.909632 4735 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 17 01:23:28 crc kubenswrapper[4735]: I0317 01:23:28.922487 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst"] Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.107624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1841e816-4298-4c01-8bfa-07273ea8dfff-webhook-cert\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.107908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zgz\" (UniqueName: 
\"kubernetes.io/projected/1841e816-4298-4c01-8bfa-07273ea8dfff-kube-api-access-r7zgz\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.107955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1841e816-4298-4c01-8bfa-07273ea8dfff-apiservice-cert\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.208800 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1841e816-4298-4c01-8bfa-07273ea8dfff-apiservice-cert\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.208895 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1841e816-4298-4c01-8bfa-07273ea8dfff-webhook-cert\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.208916 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7zgz\" (UniqueName: \"kubernetes.io/projected/1841e816-4298-4c01-8bfa-07273ea8dfff-kube-api-access-r7zgz\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " 
pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.230435 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zgz\" (UniqueName: \"kubernetes.io/projected/1841e816-4298-4c01-8bfa-07273ea8dfff-kube-api-access-r7zgz\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:29 crc kubenswrapper[4735]: I0317 01:23:29.364967 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5"] Mar 17 01:23:29 crc kubenswrapper[4735]: W0317 01:23:29.370975 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e7afb5_02be_49b0_bd12_39b2b2346a93.slice/crio-799bb3a7e91c1678ddb8594a5a40ffea8c46a21e16ada8b066084d3bd2633fb4 WatchSource:0}: Error finding container 799bb3a7e91c1678ddb8594a5a40ffea8c46a21e16ada8b066084d3bd2633fb4: Status 404 returned error can't find the container with id 799bb3a7e91c1678ddb8594a5a40ffea8c46a21e16ada8b066084d3bd2633fb4 Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.152886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" event={"ID":"81e7afb5-02be-49b0-bd12-39b2b2346a93","Type":"ContainerStarted","Data":"799bb3a7e91c1678ddb8594a5a40ffea8c46a21e16ada8b066084d3bd2633fb4"} Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.159116 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.163814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/1841e816-4298-4c01-8bfa-07273ea8dfff-apiservice-cert\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.164335 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1841e816-4298-4c01-8bfa-07273ea8dfff-webhook-cert\") pod \"metallb-operator-webhook-server-7b5d585c89-bjzst\" (UID: \"1841e816-4298-4c01-8bfa-07273ea8dfff\") " pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.265429 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.484212 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-42vkm" Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.490958 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:30 crc kubenswrapper[4735]: I0317 01:23:30.753647 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst"] Mar 17 01:23:30 crc kubenswrapper[4735]: W0317 01:23:30.758551 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1841e816_4298_4c01_8bfa_07273ea8dfff.slice/crio-c5ca608cf0df6d9c281c9300dccc3af24cc26044933f7d030ef66828ffa13b24 WatchSource:0}: Error finding container c5ca608cf0df6d9c281c9300dccc3af24cc26044933f7d030ef66828ffa13b24: Status 404 returned error can't find the container with id c5ca608cf0df6d9c281c9300dccc3af24cc26044933f7d030ef66828ffa13b24 Mar 17 01:23:31 crc kubenswrapper[4735]: I0317 01:23:31.162283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" event={"ID":"1841e816-4298-4c01-8bfa-07273ea8dfff","Type":"ContainerStarted","Data":"c5ca608cf0df6d9c281c9300dccc3af24cc26044933f7d030ef66828ffa13b24"} Mar 17 01:23:31 crc kubenswrapper[4735]: I0317 01:23:31.244474 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 01:23:33 crc kubenswrapper[4735]: I0317 01:23:33.176815 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" event={"ID":"81e7afb5-02be-49b0-bd12-39b2b2346a93","Type":"ContainerStarted","Data":"bcf049969a098450cb764333cf794be0a34a4b266fdf569497f5ea09016a5a92"} Mar 17 01:23:33 crc kubenswrapper[4735]: I0317 01:23:33.177245 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:23:33 crc kubenswrapper[4735]: I0317 01:23:33.196519 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" podStartSLOduration=1.8258748420000002 podStartE2EDuration="5.196502388s" podCreationTimestamp="2026-03-17 01:23:28 +0000 UTC" firstStartedPulling="2026-03-17 01:23:29.373332538 +0000 UTC m=+835.005565516" lastFinishedPulling="2026-03-17 01:23:32.743960084 +0000 UTC m=+838.376193062" observedRunningTime="2026-03-17 01:23:33.194989139 +0000 UTC m=+838.827222117" watchObservedRunningTime="2026-03-17 01:23:33.196502388 +0000 UTC m=+838.828735366" Mar 17 01:23:37 crc kubenswrapper[4735]: I0317 01:23:37.204001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" event={"ID":"1841e816-4298-4c01-8bfa-07273ea8dfff","Type":"ContainerStarted","Data":"7f75587cde59754fc379f22f1fdf74b4255e1fd569a692b140fd86b842e1f6fb"} Mar 17 01:23:37 crc kubenswrapper[4735]: I0317 01:23:37.204627 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 01:23:37 crc kubenswrapper[4735]: I0317 01:23:37.241817 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" podStartSLOduration=3.573949595 podStartE2EDuration="9.241792141s" podCreationTimestamp="2026-03-17 01:23:28 +0000 UTC" firstStartedPulling="2026-03-17 01:23:30.760041007 +0000 UTC m=+836.392273985" lastFinishedPulling="2026-03-17 01:23:36.427883553 +0000 UTC m=+842.060116531" observedRunningTime="2026-03-17 01:23:37.234134781 +0000 UTC m=+842.866367769" watchObservedRunningTime="2026-03-17 01:23:37.241792141 +0000 UTC m=+842.874025159" Mar 17 01:23:50 crc kubenswrapper[4735]: I0317 01:23:50.497694 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b5d585c89-bjzst" Mar 17 
01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.147621 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561844-5mvsn"] Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.150644 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.153443 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.154181 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.162694 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-5mvsn"] Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.166055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.298601 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9m69\" (UniqueName: \"kubernetes.io/projected/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd-kube-api-access-k9m69\") pod \"auto-csr-approver-29561844-5mvsn\" (UID: \"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd\") " pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.399638 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9m69\" (UniqueName: \"kubernetes.io/projected/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd-kube-api-access-k9m69\") pod \"auto-csr-approver-29561844-5mvsn\" (UID: \"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd\") " pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.429284 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9m69\" (UniqueName: \"kubernetes.io/projected/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd-kube-api-access-k9m69\") pod \"auto-csr-approver-29561844-5mvsn\" (UID: \"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd\") " pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.475300 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:00 crc kubenswrapper[4735]: I0317 01:24:00.940209 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-5mvsn"] Mar 17 01:24:00 crc kubenswrapper[4735]: W0317 01:24:00.959861 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40cbfa1_7f4e_4e47_a8a6_444a4c087efd.slice/crio-fc9fd2e966082193489d6e3197eb2c6d11e808c871ac6036178199d0196e1e5b WatchSource:0}: Error finding container fc9fd2e966082193489d6e3197eb2c6d11e808c871ac6036178199d0196e1e5b: Status 404 returned error can't find the container with id fc9fd2e966082193489d6e3197eb2c6d11e808c871ac6036178199d0196e1e5b Mar 17 01:24:01 crc kubenswrapper[4735]: I0317 01:24:01.344016 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" event={"ID":"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd","Type":"ContainerStarted","Data":"fc9fd2e966082193489d6e3197eb2c6d11e808c871ac6036178199d0196e1e5b"} Mar 17 01:24:02 crc kubenswrapper[4735]: I0317 01:24:02.349213 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" event={"ID":"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd","Type":"ContainerStarted","Data":"356d6b57d9e05fdbec20214f9cf284404c8599bde4ae456a5a47df63588eba71"} Mar 17 01:24:02 crc kubenswrapper[4735]: I0317 01:24:02.360770 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" podStartSLOduration=1.246362084 podStartE2EDuration="2.360755354s" podCreationTimestamp="2026-03-17 01:24:00 +0000 UTC" firstStartedPulling="2026-03-17 01:24:00.962306016 +0000 UTC m=+866.594539004" lastFinishedPulling="2026-03-17 01:24:02.076699296 +0000 UTC m=+867.708932274" observedRunningTime="2026-03-17 01:24:02.35777732 +0000 UTC m=+867.990010298" watchObservedRunningTime="2026-03-17 01:24:02.360755354 +0000 UTC m=+867.992988332" Mar 17 01:24:03 crc kubenswrapper[4735]: I0317 01:24:03.381534 4735 generic.go:334] "Generic (PLEG): container finished" podID="c40cbfa1-7f4e-4e47-a8a6-444a4c087efd" containerID="356d6b57d9e05fdbec20214f9cf284404c8599bde4ae456a5a47df63588eba71" exitCode=0 Mar 17 01:24:03 crc kubenswrapper[4735]: I0317 01:24:03.381585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" event={"ID":"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd","Type":"ContainerDied","Data":"356d6b57d9e05fdbec20214f9cf284404c8599bde4ae456a5a47df63588eba71"} Mar 17 01:24:04 crc kubenswrapper[4735]: I0317 01:24:04.773330 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:04 crc kubenswrapper[4735]: I0317 01:24:04.970902 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9m69\" (UniqueName: \"kubernetes.io/projected/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd-kube-api-access-k9m69\") pod \"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd\" (UID: \"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd\") " Mar 17 01:24:04 crc kubenswrapper[4735]: I0317 01:24:04.988357 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd-kube-api-access-k9m69" (OuterVolumeSpecName: "kube-api-access-k9m69") pod "c40cbfa1-7f4e-4e47-a8a6-444a4c087efd" (UID: "c40cbfa1-7f4e-4e47-a8a6-444a4c087efd"). InnerVolumeSpecName "kube-api-access-k9m69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:24:05 crc kubenswrapper[4735]: I0317 01:24:05.073184 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9m69\" (UniqueName: \"kubernetes.io/projected/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd-kube-api-access-k9m69\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:05 crc kubenswrapper[4735]: I0317 01:24:05.400730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" event={"ID":"c40cbfa1-7f4e-4e47-a8a6-444a4c087efd","Type":"ContainerDied","Data":"fc9fd2e966082193489d6e3197eb2c6d11e808c871ac6036178199d0196e1e5b"} Mar 17 01:24:05 crc kubenswrapper[4735]: I0317 01:24:05.401182 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9fd2e966082193489d6e3197eb2c6d11e808c871ac6036178199d0196e1e5b" Mar 17 01:24:05 crc kubenswrapper[4735]: I0317 01:24:05.400791 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-5mvsn" Mar 17 01:24:05 crc kubenswrapper[4735]: I0317 01:24:05.442667 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-298bz"] Mar 17 01:24:05 crc kubenswrapper[4735]: I0317 01:24:05.447296 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-298bz"] Mar 17 01:24:07 crc kubenswrapper[4735]: I0317 01:24:07.097986 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176" path="/var/lib/kubelet/pods/4ea7a7ff-48b9-4f8c-8ecc-7d7fd1482176/volumes" Mar 17 01:24:08 crc kubenswrapper[4735]: I0317 01:24:08.847423 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5bffff7ccd-ss6s5" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.653344 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pfcw2"] Mar 17 01:24:09 crc kubenswrapper[4735]: E0317 01:24:09.653603 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40cbfa1-7f4e-4e47-a8a6-444a4c087efd" containerName="oc" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.653793 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40cbfa1-7f4e-4e47-a8a6-444a4c087efd" containerName="oc" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.654010 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40cbfa1-7f4e-4e47-a8a6-444a4c087efd" containerName="oc" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.656413 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.657956 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.659441 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tswxt" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.659754 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.660846 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b"] Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.661685 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.664065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.678089 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b"] Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.764903 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-slwx6"] Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.765782 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-slwx6" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.768129 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-788lk"] Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.768727 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.768807 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.768853 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.769004 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.769601 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dhhv4" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.771943 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.787504 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-788lk"] Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.834891 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-sockets\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.834941 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-metrics\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.834966 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-conf\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.835539 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/665a221c-0d9f-4dfd-888a-fc7d5f09fbdb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-c667b\" (UID: \"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.835803 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/899c4d8c-fe75-4189-af67-c3edbd89d3fc-metrics-certs\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.835877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-startup\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.835896 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-reloader\") pod 
\"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.835916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px28w\" (UniqueName: \"kubernetes.io/projected/665a221c-0d9f-4dfd-888a-fc7d5f09fbdb-kube-api-access-px28w\") pod \"frr-k8s-webhook-server-bcc4b6f68-c667b\" (UID: \"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.835939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctwx\" (UniqueName: \"kubernetes.io/projected/899c4d8c-fe75-4189-af67-c3edbd89d3fc-kube-api-access-wctwx\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.937224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-cert\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.937286 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-sockets\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.937347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist\") pod \"speaker-slwx6\" (UID: 
\"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.937723 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-sockets\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.937788 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-metrics\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.937809 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-conf\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/665a221c-0d9f-4dfd-888a-fc7d5f09fbdb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-c667b\" (UID: \"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-metrics\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938104 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-metrics-certs\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938136 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/899c4d8c-fe75-4189-af67-c3edbd89d3fc-metrics-certs\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938178 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-startup\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-reloader\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938239 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-conf\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/899c4d8c-fe75-4189-af67-c3edbd89d3fc-reloader\") pod \"frr-k8s-pfcw2\" (UID: 
\"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938908 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/899c4d8c-fe75-4189-af67-c3edbd89d3fc-frr-startup\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938942 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px28w\" (UniqueName: \"kubernetes.io/projected/665a221c-0d9f-4dfd-888a-fc7d5f09fbdb-kube-api-access-px28w\") pod \"frr-k8s-webhook-server-bcc4b6f68-c667b\" (UID: \"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.938983 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wctwx\" (UniqueName: \"kubernetes.io/projected/899c4d8c-fe75-4189-af67-c3edbd89d3fc-kube-api-access-wctwx\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.939003 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5r9q\" (UniqueName: \"kubernetes.io/projected/1edf1039-ebea-4804-9f30-6844633b7919-kube-api-access-b5r9q\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.940340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metrics-certs\") pod \"speaker-slwx6\" (UID: 
\"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.940372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7w5\" (UniqueName: \"kubernetes.io/projected/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-kube-api-access-lb7w5\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.940394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metallb-excludel2\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.946372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/665a221c-0d9f-4dfd-888a-fc7d5f09fbdb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-c667b\" (UID: \"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.955247 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/899c4d8c-fe75-4189-af67-c3edbd89d3fc-metrics-certs\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.971191 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px28w\" (UniqueName: \"kubernetes.io/projected/665a221c-0d9f-4dfd-888a-fc7d5f09fbdb-kube-api-access-px28w\") pod \"frr-k8s-webhook-server-bcc4b6f68-c667b\" (UID: \"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.975437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wctwx\" (UniqueName: \"kubernetes.io/projected/899c4d8c-fe75-4189-af67-c3edbd89d3fc-kube-api-access-wctwx\") pod \"frr-k8s-pfcw2\" (UID: \"899c4d8c-fe75-4189-af67-c3edbd89d3fc\") " pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.983097 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:09 crc kubenswrapper[4735]: I0317 01:24:09.987094 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.040844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-metrics-certs\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.040922 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5r9q\" (UniqueName: \"kubernetes.io/projected/1edf1039-ebea-4804-9f30-6844633b7919-kube-api-access-b5r9q\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.040945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metrics-certs\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: 
I0317 01:24:10.040964 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7w5\" (UniqueName: \"kubernetes.io/projected/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-kube-api-access-lb7w5\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.040984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metallb-excludel2\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.041004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-cert\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.041026 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.041126 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.041171 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist podName:3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e nodeName:}" failed. No retries permitted until 2026-03-17 01:24:10.541155036 +0000 UTC m=+876.173388014 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist") pod "speaker-slwx6" (UID: "3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e") : secret "metallb-memberlist" not found Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.041385 4735 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.041413 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-metrics-certs podName:1edf1039-ebea-4804-9f30-6844633b7919 nodeName:}" failed. No retries permitted until 2026-03-17 01:24:10.541406802 +0000 UTC m=+876.173639780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-metrics-certs") pod "controller-7bb4cc7c98-788lk" (UID: "1edf1039-ebea-4804-9f30-6844633b7919") : secret "controller-certs-secret" not found Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.041574 4735 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.041600 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metrics-certs podName:3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e nodeName:}" failed. No retries permitted until 2026-03-17 01:24:10.541593686 +0000 UTC m=+876.173826664 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metrics-certs") pod "speaker-slwx6" (UID: "3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e") : secret "speaker-certs-secret" not found Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.042236 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metallb-excludel2\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.044247 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.056337 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-cert\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.083512 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7w5\" (UniqueName: \"kubernetes.io/projected/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-kube-api-access-lb7w5\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.098586 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5r9q\" (UniqueName: \"kubernetes.io/projected/1edf1039-ebea-4804-9f30-6844633b7919-kube-api-access-b5r9q\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.302105 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b"] Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.434560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"25d1f232767b58a1e3b1dd4a7685fb3ca239fac42bfdde07a1816139f8714a77"} Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.435432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" event={"ID":"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb","Type":"ContainerStarted","Data":"ab4faf82fe33c2809506158a5f7e139153bb12c4abc2c8dc9e3bd7166e693e36"} Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.555593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metrics-certs\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.555690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.555761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-metrics-certs\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.560074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/1edf1039-ebea-4804-9f30-6844633b7919-metrics-certs\") pod \"controller-7bb4cc7c98-788lk\" (UID: \"1edf1039-ebea-4804-9f30-6844633b7919\") " pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.562532 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-metrics-certs\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.562634 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 17 01:24:10 crc kubenswrapper[4735]: E0317 01:24:10.562699 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist podName:3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e nodeName:}" failed. No retries permitted until 2026-03-17 01:24:11.562676887 +0000 UTC m=+877.194909875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist") pod "speaker-slwx6" (UID: "3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e") : secret "metallb-memberlist" not found Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.695253 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:10 crc kubenswrapper[4735]: I0317 01:24:10.938064 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-788lk"] Mar 17 01:24:10 crc kubenswrapper[4735]: W0317 01:24:10.940647 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1edf1039_ebea_4804_9f30_6844633b7919.slice/crio-fc200c65d4a1bc099e4f306dc09aea4677867fcfcafdde7405df484e570d4a99 WatchSource:0}: Error finding container fc200c65d4a1bc099e4f306dc09aea4677867fcfcafdde7405df484e570d4a99: Status 404 returned error can't find the container with id fc200c65d4a1bc099e4f306dc09aea4677867fcfcafdde7405df484e570d4a99 Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.443384 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-788lk" event={"ID":"1edf1039-ebea-4804-9f30-6844633b7919","Type":"ContainerStarted","Data":"6acfdc81a3ed47f4e017d66daeec741aa71b7461377b64a7d502c08eee3863f4"} Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.443442 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-788lk" event={"ID":"1edf1039-ebea-4804-9f30-6844633b7919","Type":"ContainerStarted","Data":"88baa4bd7024f6514041c33633507db919bd57c3b163ed8c07415b6c88bf3538"} Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.443455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-788lk" event={"ID":"1edf1039-ebea-4804-9f30-6844633b7919","Type":"ContainerStarted","Data":"fc200c65d4a1bc099e4f306dc09aea4677867fcfcafdde7405df484e570d4a99"} Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.466513 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-788lk" podStartSLOduration=2.466497258 podStartE2EDuration="2.466497258s" 
podCreationTimestamp="2026-03-17 01:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:24:11.465339959 +0000 UTC m=+877.097572947" watchObservedRunningTime="2026-03-17 01:24:11.466497258 +0000 UTC m=+877.098730236" Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.569711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.585023 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e-memberlist\") pod \"speaker-slwx6\" (UID: \"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e\") " pod="metallb-system/speaker-slwx6" Mar 17 01:24:11 crc kubenswrapper[4735]: I0317 01:24:11.880037 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-slwx6" Mar 17 01:24:12 crc kubenswrapper[4735]: I0317 01:24:12.457571 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-slwx6" event={"ID":"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e","Type":"ContainerStarted","Data":"4faf2665b96df2891c4df83b0f7b910785b24c37c0c71f958a8760436116ecfa"} Mar 17 01:24:12 crc kubenswrapper[4735]: I0317 01:24:12.457607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-slwx6" event={"ID":"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e","Type":"ContainerStarted","Data":"5d92a72035abd99916c17c467cfe4aeb56d0a0cbfafca2881336707c5d63b62e"} Mar 17 01:24:12 crc kubenswrapper[4735]: I0317 01:24:12.457696 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:13 crc kubenswrapper[4735]: I0317 01:24:13.475127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-slwx6" event={"ID":"3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e","Type":"ContainerStarted","Data":"6dfae507b6120e61c05bbd21a575bf55f6d874fcf00ff523ffa99aedb449fe31"} Mar 17 01:24:14 crc kubenswrapper[4735]: I0317 01:24:14.483502 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-slwx6" Mar 17 01:24:15 crc kubenswrapper[4735]: I0317 01:24:15.096799 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-slwx6" podStartSLOduration=6.09677694 podStartE2EDuration="6.09677694s" podCreationTimestamp="2026-03-17 01:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:24:13.502890618 +0000 UTC m=+879.135123596" watchObservedRunningTime="2026-03-17 01:24:15.09677694 +0000 UTC m=+880.729009918" Mar 17 01:24:18 crc kubenswrapper[4735]: I0317 01:24:18.511982 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="899c4d8c-fe75-4189-af67-c3edbd89d3fc" containerID="13337421e446171be07a9978c447abf389d3f815e262e8ecc89dcc9a4b43c18b" exitCode=0 Mar 17 01:24:18 crc kubenswrapper[4735]: I0317 01:24:18.512601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerDied","Data":"13337421e446171be07a9978c447abf389d3f815e262e8ecc89dcc9a4b43c18b"} Mar 17 01:24:18 crc kubenswrapper[4735]: I0317 01:24:18.515157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" event={"ID":"665a221c-0d9f-4dfd-888a-fc7d5f09fbdb","Type":"ContainerStarted","Data":"88fb01f1b9e40f6ff94b24bfa5e4262d3a05c765c8435eed6e5b1118f70be39b"} Mar 17 01:24:18 crc kubenswrapper[4735]: I0317 01:24:18.515938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:19 crc kubenswrapper[4735]: I0317 01:24:19.528383 4735 generic.go:334] "Generic (PLEG): container finished" podID="899c4d8c-fe75-4189-af67-c3edbd89d3fc" containerID="f91fb0cea88d7e2c729310000c33ccf09f370f398a92eb32bdf98bc98dc25c17" exitCode=0 Mar 17 01:24:19 crc kubenswrapper[4735]: I0317 01:24:19.528448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerDied","Data":"f91fb0cea88d7e2c729310000c33ccf09f370f398a92eb32bdf98bc98dc25c17"} Mar 17 01:24:19 crc kubenswrapper[4735]: I0317 01:24:19.574824 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" podStartSLOduration=3.007621875 podStartE2EDuration="10.574794405s" podCreationTimestamp="2026-03-17 01:24:09 +0000 UTC" firstStartedPulling="2026-03-17 01:24:10.307129165 +0000 UTC m=+875.939362143" lastFinishedPulling="2026-03-17 01:24:17.874301655 +0000 UTC m=+883.506534673" 
observedRunningTime="2026-03-17 01:24:18.568583912 +0000 UTC m=+884.200816910" watchObservedRunningTime="2026-03-17 01:24:19.574794405 +0000 UTC m=+885.207027433" Mar 17 01:24:20 crc kubenswrapper[4735]: I0317 01:24:20.570254 4735 generic.go:334] "Generic (PLEG): container finished" podID="899c4d8c-fe75-4189-af67-c3edbd89d3fc" containerID="416df46cf541571a9de20ee02ce78dc7f4e159300c4827eed98b857fd6b59884" exitCode=0 Mar 17 01:24:20 crc kubenswrapper[4735]: I0317 01:24:20.570343 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerDied","Data":"416df46cf541571a9de20ee02ce78dc7f4e159300c4827eed98b857fd6b59884"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.579949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"44c95761ea45973f64d2767e6639a7d511159ecf1f356f1a64928d8375b49699"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.580268 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"5bd23c6c2938f0e74a1f3baaf6982dc23dfad21eb478f36db93214ee33738213"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.580279 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"f8df79e63b604782d1b322846f7059b6fcb0cea5d53e63ce2debf0783d6b2d3e"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.580291 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"f9cb4bf92297cc26dbb178e686b2492c09bca858ce9e4a80e33778182dfc4d8c"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 
01:24:21.580303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"11d6ed4409f410a1bddf090fd93733e9d12dda16ee8d35b3ea3e54126bd3c1be"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.580315 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfcw2" event={"ID":"899c4d8c-fe75-4189-af67-c3edbd89d3fc","Type":"ContainerStarted","Data":"3bbb2a9beb36c6bd06bc90b2a0cd8956c56b059d7dac6a58e8b35bfdb2cf1a13"} Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.580364 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:21 crc kubenswrapper[4735]: I0317 01:24:21.598780 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pfcw2" podStartSLOduration=4.868789219 podStartE2EDuration="12.598763837s" podCreationTimestamp="2026-03-17 01:24:09 +0000 UTC" firstStartedPulling="2026-03-17 01:24:10.117242978 +0000 UTC m=+875.749475956" lastFinishedPulling="2026-03-17 01:24:17.847217586 +0000 UTC m=+883.479450574" observedRunningTime="2026-03-17 01:24:21.596876981 +0000 UTC m=+887.229109959" watchObservedRunningTime="2026-03-17 01:24:21.598763837 +0000 UTC m=+887.230996815" Mar 17 01:24:24 crc kubenswrapper[4735]: I0317 01:24:24.983924 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:25 crc kubenswrapper[4735]: I0317 01:24:25.047002 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:30 crc kubenswrapper[4735]: I0317 01:24:30.001085 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c667b" Mar 17 01:24:30 crc kubenswrapper[4735]: I0317 01:24:30.702127 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-788lk" Mar 17 01:24:31 crc kubenswrapper[4735]: I0317 01:24:31.890251 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-slwx6" Mar 17 01:24:34 crc kubenswrapper[4735]: I0317 01:24:34.887779 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5kmcm"] Mar 17 01:24:34 crc kubenswrapper[4735]: I0317 01:24:34.889277 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:34 crc kubenswrapper[4735]: I0317 01:24:34.891685 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 17 01:24:34 crc kubenswrapper[4735]: I0317 01:24:34.892119 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-r92zk" Mar 17 01:24:34 crc kubenswrapper[4735]: I0317 01:24:34.892283 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 17 01:24:34 crc kubenswrapper[4735]: I0317 01:24:34.963192 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5kmcm"] Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.082953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hvqq\" (UniqueName: \"kubernetes.io/projected/de49487e-78f2-48bf-8056-01c129f23e53-kube-api-access-7hvqq\") pod \"openstack-operator-index-5kmcm\" (UID: \"de49487e-78f2-48bf-8056-01c129f23e53\") " pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.183683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hvqq\" (UniqueName: 
\"kubernetes.io/projected/de49487e-78f2-48bf-8056-01c129f23e53-kube-api-access-7hvqq\") pod \"openstack-operator-index-5kmcm\" (UID: \"de49487e-78f2-48bf-8056-01c129f23e53\") " pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.197741 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.208381 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.227776 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hvqq\" (UniqueName: \"kubernetes.io/projected/de49487e-78f2-48bf-8056-01c129f23e53-kube-api-access-7hvqq\") pod \"openstack-operator-index-5kmcm\" (UID: \"de49487e-78f2-48bf-8056-01c129f23e53\") " pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.511640 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-r92zk" Mar 17 01:24:35 crc kubenswrapper[4735]: I0317 01:24:35.519105 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:36 crc kubenswrapper[4735]: I0317 01:24:36.000662 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5kmcm"] Mar 17 01:24:36 crc kubenswrapper[4735]: W0317 01:24:36.007059 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde49487e_78f2_48bf_8056_01c129f23e53.slice/crio-35136807fef3d7ebbaf9b2119417dd94406d0ef941286efdb3c8929b45200396 WatchSource:0}: Error finding container 35136807fef3d7ebbaf9b2119417dd94406d0ef941286efdb3c8929b45200396: Status 404 returned error can't find the container with id 35136807fef3d7ebbaf9b2119417dd94406d0ef941286efdb3c8929b45200396 Mar 17 01:24:36 crc kubenswrapper[4735]: I0317 01:24:36.356814 4735 scope.go:117] "RemoveContainer" containerID="1a612cfa9aafce66665a794064c741dec915e254c572dab5974f01aabe68560e" Mar 17 01:24:36 crc kubenswrapper[4735]: I0317 01:24:36.714340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kmcm" event={"ID":"de49487e-78f2-48bf-8056-01c129f23e53","Type":"ContainerStarted","Data":"35136807fef3d7ebbaf9b2119417dd94406d0ef941286efdb3c8929b45200396"} Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.236784 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5kmcm"] Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.738647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kmcm" event={"ID":"de49487e-78f2-48bf-8056-01c129f23e53","Type":"ContainerStarted","Data":"682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1"} Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.738851 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-5kmcm" podUID="de49487e-78f2-48bf-8056-01c129f23e53" containerName="registry-server" containerID="cri-o://682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1" gracePeriod=2 Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.768897 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5kmcm" podStartSLOduration=2.428395526 podStartE2EDuration="4.768851478s" podCreationTimestamp="2026-03-17 01:24:34 +0000 UTC" firstStartedPulling="2026-03-17 01:24:36.009339088 +0000 UTC m=+901.641572096" lastFinishedPulling="2026-03-17 01:24:38.34979507 +0000 UTC m=+903.982028048" observedRunningTime="2026-03-17 01:24:38.763398553 +0000 UTC m=+904.395631561" watchObservedRunningTime="2026-03-17 01:24:38.768851478 +0000 UTC m=+904.401084496" Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.854420 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-djqxj"] Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.862775 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:38 crc kubenswrapper[4735]: I0317 01:24:38.871114 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-djqxj"] Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.035148 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgp7\" (UniqueName: \"kubernetes.io/projected/0b8682ae-cefe-457b-b2b2-76753bc1db5f-kube-api-access-dwgp7\") pod \"openstack-operator-index-djqxj\" (UID: \"0b8682ae-cefe-457b-b2b2-76753bc1db5f\") " pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.137169 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgp7\" (UniqueName: \"kubernetes.io/projected/0b8682ae-cefe-457b-b2b2-76753bc1db5f-kube-api-access-dwgp7\") pod \"openstack-operator-index-djqxj\" (UID: \"0b8682ae-cefe-457b-b2b2-76753bc1db5f\") " pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.162722 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwgp7\" (UniqueName: \"kubernetes.io/projected/0b8682ae-cefe-457b-b2b2-76753bc1db5f-kube-api-access-dwgp7\") pod \"openstack-operator-index-djqxj\" (UID: \"0b8682ae-cefe-457b-b2b2-76753bc1db5f\") " pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.208135 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.215269 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.339776 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hvqq\" (UniqueName: \"kubernetes.io/projected/de49487e-78f2-48bf-8056-01c129f23e53-kube-api-access-7hvqq\") pod \"de49487e-78f2-48bf-8056-01c129f23e53\" (UID: \"de49487e-78f2-48bf-8056-01c129f23e53\") " Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.344638 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de49487e-78f2-48bf-8056-01c129f23e53-kube-api-access-7hvqq" (OuterVolumeSpecName: "kube-api-access-7hvqq") pod "de49487e-78f2-48bf-8056-01c129f23e53" (UID: "de49487e-78f2-48bf-8056-01c129f23e53"). InnerVolumeSpecName "kube-api-access-7hvqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.441648 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hvqq\" (UniqueName: \"kubernetes.io/projected/de49487e-78f2-48bf-8056-01c129f23e53-kube-api-access-7hvqq\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.716600 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-djqxj"] Mar 17 01:24:39 crc kubenswrapper[4735]: W0317 01:24:39.728673 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b8682ae_cefe_457b_b2b2_76753bc1db5f.slice/crio-005c0ab248b0d37668e51d01831756acb351c4ad8d0636dfec6e0165d0661296 WatchSource:0}: Error finding container 005c0ab248b0d37668e51d01831756acb351c4ad8d0636dfec6e0165d0661296: Status 404 returned error can't find the container with id 005c0ab248b0d37668e51d01831756acb351c4ad8d0636dfec6e0165d0661296 Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.755382 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="de49487e-78f2-48bf-8056-01c129f23e53" containerID="682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1" exitCode=0 Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.755443 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5kmcm" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.755504 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kmcm" event={"ID":"de49487e-78f2-48bf-8056-01c129f23e53","Type":"ContainerDied","Data":"682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1"} Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.760634 4735 scope.go:117] "RemoveContainer" containerID="682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.760583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kmcm" event={"ID":"de49487e-78f2-48bf-8056-01c129f23e53","Type":"ContainerDied","Data":"35136807fef3d7ebbaf9b2119417dd94406d0ef941286efdb3c8929b45200396"} Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.763743 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-djqxj" event={"ID":"0b8682ae-cefe-457b-b2b2-76753bc1db5f","Type":"ContainerStarted","Data":"005c0ab248b0d37668e51d01831756acb351c4ad8d0636dfec6e0165d0661296"} Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.795508 4735 scope.go:117] "RemoveContainer" containerID="682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1" Mar 17 01:24:39 crc kubenswrapper[4735]: E0317 01:24:39.796658 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1\": container with ID starting with 
682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1 not found: ID does not exist" containerID="682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.796693 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1"} err="failed to get container status \"682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1\": rpc error: code = NotFound desc = could not find container \"682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1\": container with ID starting with 682f365fb6cdd595420367e3dad981792f14ce06d1fee25803d5477b128786c1 not found: ID does not exist" Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.806083 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5kmcm"] Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.822405 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5kmcm"] Mar 17 01:24:39 crc kubenswrapper[4735]: I0317 01:24:39.993252 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pfcw2" Mar 17 01:24:40 crc kubenswrapper[4735]: I0317 01:24:40.774077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-djqxj" event={"ID":"0b8682ae-cefe-457b-b2b2-76753bc1db5f","Type":"ContainerStarted","Data":"c64e77b433932ea55720d00ef75d7d0c1f637efba8ced14c5396ab00b0fceaf9"} Mar 17 01:24:40 crc kubenswrapper[4735]: I0317 01:24:40.799658 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-djqxj" podStartSLOduration=2.713037976 podStartE2EDuration="2.799633669s" podCreationTimestamp="2026-03-17 01:24:38 +0000 UTC" firstStartedPulling="2026-03-17 01:24:39.736097867 +0000 UTC 
m=+905.368330875" lastFinishedPulling="2026-03-17 01:24:39.82269358 +0000 UTC m=+905.454926568" observedRunningTime="2026-03-17 01:24:40.792103372 +0000 UTC m=+906.424336390" watchObservedRunningTime="2026-03-17 01:24:40.799633669 +0000 UTC m=+906.431866657" Mar 17 01:24:41 crc kubenswrapper[4735]: I0317 01:24:41.083629 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de49487e-78f2-48bf-8056-01c129f23e53" path="/var/lib/kubelet/pods/de49487e-78f2-48bf-8056-01c129f23e53/volumes" Mar 17 01:24:49 crc kubenswrapper[4735]: I0317 01:24:49.217085 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:49 crc kubenswrapper[4735]: I0317 01:24:49.219031 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:49 crc kubenswrapper[4735]: I0317 01:24:49.261170 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:49 crc kubenswrapper[4735]: I0317 01:24:49.895460 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-djqxj" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.583317 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st"] Mar 17 01:24:56 crc kubenswrapper[4735]: E0317 01:24:56.583816 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de49487e-78f2-48bf-8056-01c129f23e53" containerName="registry-server" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.583828 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="de49487e-78f2-48bf-8056-01c129f23e53" containerName="registry-server" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.583954 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="de49487e-78f2-48bf-8056-01c129f23e53" containerName="registry-server" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.584669 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.595408 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5kqsb" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.600256 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st"] Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.693909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-bundle\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.694113 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-kube-api-access-57t2j\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.694269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-util\") pod 
\"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.796309 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-kube-api-access-57t2j\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.796403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-util\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.796513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-bundle\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.797414 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-util\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " 
pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.797501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-bundle\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.833230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-kube-api-access-57t2j\") pod \"7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:56 crc kubenswrapper[4735]: I0317 01:24:56.905502 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:24:57 crc kubenswrapper[4735]: I0317 01:24:57.409694 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st"] Mar 17 01:24:57 crc kubenswrapper[4735]: I0317 01:24:57.916252 4735 generic.go:334] "Generic (PLEG): container finished" podID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerID="a0bdbc319c5c36e1ba76bd3decc26eb47e14bab7e41505e4a12bf92831808ae3" exitCode=0 Mar 17 01:24:57 crc kubenswrapper[4735]: I0317 01:24:57.916313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" event={"ID":"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed","Type":"ContainerDied","Data":"a0bdbc319c5c36e1ba76bd3decc26eb47e14bab7e41505e4a12bf92831808ae3"} Mar 17 01:24:57 crc kubenswrapper[4735]: I0317 01:24:57.916350 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" event={"ID":"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed","Type":"ContainerStarted","Data":"17c89e7ef59b1fba3ab0aa276b8cd3aed88960c70a386611ebe9f58989bb0519"} Mar 17 01:24:58 crc kubenswrapper[4735]: I0317 01:24:58.922321 4735 generic.go:334] "Generic (PLEG): container finished" podID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerID="5692f6a4e19a0ff568ff32f40d0944cd188cf39cf2f749bd3de105bc26e1c174" exitCode=0 Mar 17 01:24:58 crc kubenswrapper[4735]: I0317 01:24:58.922511 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" event={"ID":"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed","Type":"ContainerDied","Data":"5692f6a4e19a0ff568ff32f40d0944cd188cf39cf2f749bd3de105bc26e1c174"} Mar 17 01:24:59 crc kubenswrapper[4735]: I0317 01:24:59.933758 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerID="cb99ed0b35ee35e3a321ee883ccf6d6efc27dc5a0ff71400e979e393b4ff22d3" exitCode=0 Mar 17 01:24:59 crc kubenswrapper[4735]: I0317 01:24:59.933874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" event={"ID":"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed","Type":"ContainerDied","Data":"cb99ed0b35ee35e3a321ee883ccf6d6efc27dc5a0ff71400e979e393b4ff22d3"} Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.304666 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.466882 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-util\") pod \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.467011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-kube-api-access-57t2j\") pod \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.467101 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-bundle\") pod \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\" (UID: \"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed\") " Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.468268 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-bundle" (OuterVolumeSpecName: "bundle") pod "2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" (UID: "2df0e81e-3a8a-4a41-947e-4a9a86ef50ed"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.475289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-kube-api-access-57t2j" (OuterVolumeSpecName: "kube-api-access-57t2j") pod "2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" (UID: "2df0e81e-3a8a-4a41-947e-4a9a86ef50ed"). InnerVolumeSpecName "kube-api-access-57t2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.501326 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-util" (OuterVolumeSpecName: "util") pod "2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" (UID: "2df0e81e-3a8a-4a41-947e-4a9a86ef50ed"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.569270 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-util\") on node \"crc\" DevicePath \"\"" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.569319 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-kube-api-access-57t2j\") on node \"crc\" DevicePath \"\"" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.569340 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2df0e81e-3a8a-4a41-947e-4a9a86ef50ed-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.956201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" event={"ID":"2df0e81e-3a8a-4a41-947e-4a9a86ef50ed","Type":"ContainerDied","Data":"17c89e7ef59b1fba3ab0aa276b8cd3aed88960c70a386611ebe9f58989bb0519"} Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.956309 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17c89e7ef59b1fba3ab0aa276b8cd3aed88960c70a386611ebe9f58989bb0519" Mar 17 01:25:01 crc kubenswrapper[4735]: I0317 01:25:01.956238 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.105895 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj"] Mar 17 01:25:09 crc kubenswrapper[4735]: E0317 01:25:09.106641 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="extract" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.106656 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="extract" Mar 17 01:25:09 crc kubenswrapper[4735]: E0317 01:25:09.106676 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="util" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.106684 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="util" Mar 17 01:25:09 crc kubenswrapper[4735]: E0317 01:25:09.106698 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="pull" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.106707 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="pull" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.106849 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df0e81e-3a8a-4a41-947e-4a9a86ef50ed" containerName="extract" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.107293 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.110255 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-4hzcw" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.127793 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj"] Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.283005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qpq4\" (UniqueName: \"kubernetes.io/projected/f640bb25-8c1e-4718-9f44-dec9ab10fbb9-kube-api-access-2qpq4\") pod \"openstack-operator-controller-init-6597c75466-b4kfj\" (UID: \"f640bb25-8c1e-4718-9f44-dec9ab10fbb9\") " pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.385117 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qpq4\" (UniqueName: \"kubernetes.io/projected/f640bb25-8c1e-4718-9f44-dec9ab10fbb9-kube-api-access-2qpq4\") pod \"openstack-operator-controller-init-6597c75466-b4kfj\" (UID: \"f640bb25-8c1e-4718-9f44-dec9ab10fbb9\") " pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.416428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qpq4\" (UniqueName: \"kubernetes.io/projected/f640bb25-8c1e-4718-9f44-dec9ab10fbb9-kube-api-access-2qpq4\") pod \"openstack-operator-controller-init-6597c75466-b4kfj\" (UID: \"f640bb25-8c1e-4718-9f44-dec9ab10fbb9\") " pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.427617 4735 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.794347 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj"] Mar 17 01:25:09 crc kubenswrapper[4735]: I0317 01:25:09.801597 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:25:10 crc kubenswrapper[4735]: I0317 01:25:10.017011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" event={"ID":"f640bb25-8c1e-4718-9f44-dec9ab10fbb9","Type":"ContainerStarted","Data":"0007bc0251d07f1328fb9887d0a00b7f435604d27c7cf0752bf87f262b38edc3"} Mar 17 01:25:12 crc kubenswrapper[4735]: I0317 01:25:12.606796 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:25:12 crc kubenswrapper[4735]: I0317 01:25:12.608316 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:25:15 crc kubenswrapper[4735]: I0317 01:25:15.054486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" event={"ID":"f640bb25-8c1e-4718-9f44-dec9ab10fbb9","Type":"ContainerStarted","Data":"4afb39323197f5058606b7d70b363363940947b7c651306f93d1667a468d3903"} Mar 17 01:25:15 crc kubenswrapper[4735]: I0317 01:25:15.055031 4735 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:15 crc kubenswrapper[4735]: I0317 01:25:15.107969 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" podStartSLOduration=1.903652702 podStartE2EDuration="6.107938995s" podCreationTimestamp="2026-03-17 01:25:09 +0000 UTC" firstStartedPulling="2026-03-17 01:25:09.801395773 +0000 UTC m=+935.433628751" lastFinishedPulling="2026-03-17 01:25:14.005682066 +0000 UTC m=+939.637915044" observedRunningTime="2026-03-17 01:25:15.101727892 +0000 UTC m=+940.733960910" watchObservedRunningTime="2026-03-17 01:25:15.107938995 +0000 UTC m=+940.740172013" Mar 17 01:25:19 crc kubenswrapper[4735]: I0317 01:25:19.431059 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6597c75466-b4kfj" Mar 17 01:25:42 crc kubenswrapper[4735]: I0317 01:25:42.606762 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:25:42 crc kubenswrapper[4735]: I0317 01:25:42.607250 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.765613 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 
01:25:56.766797 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.777679 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.778589 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.779155 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7mkp5" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.785933 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-kthxq" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.794684 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.815082 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.826901 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.827605 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.836415 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6vtsc" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.844750 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.845536 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.851367 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2xzb2" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.851751 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.859665 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.867522 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4v2\" (UniqueName: \"kubernetes.io/projected/33c29a1c-b6e1-4b71-a0ed-b9a8851a0558-kube-api-access-4n4v2\") pod \"barbican-operator-controller-manager-59bc569d95-wtjk5\" (UID: \"33c29a1c-b6e1-4b71-a0ed-b9a8851a0558\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.867710 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dxwrd\" (UniqueName: \"kubernetes.io/projected/37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575-kube-api-access-dxwrd\") pod \"cinder-operator-controller-manager-8d58dc466-dsd6v\" (UID: \"37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.903823 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.904723 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.920417 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8kxg2" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.926315 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.927180 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.933064 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t54jb" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.945921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.963595 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.970804 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxwrd\" (UniqueName: \"kubernetes.io/projected/37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575-kube-api-access-dxwrd\") pod \"cinder-operator-controller-manager-8d58dc466-dsd6v\" (UID: \"37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.970868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzw7d\" (UniqueName: \"kubernetes.io/projected/4b767f40-0d53-4067-a546-0f14da7659bc-kube-api-access-vzw7d\") pod \"designate-operator-controller-manager-588d4d986b-mrkzz\" (UID: \"4b767f40-0d53-4067-a546-0f14da7659bc\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.970914 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4v2\" (UniqueName: \"kubernetes.io/projected/33c29a1c-b6e1-4b71-a0ed-b9a8851a0558-kube-api-access-4n4v2\") pod \"barbican-operator-controller-manager-59bc569d95-wtjk5\" (UID: 
\"33c29a1c-b6e1-4b71-a0ed-b9a8851a0558\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.970948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c86v\" (UniqueName: \"kubernetes.io/projected/a628f7da-d487-43c7-9965-5697505667fb-kube-api-access-8c86v\") pod \"horizon-operator-controller-manager-8464cc45fb-r4z8g\" (UID: \"a628f7da-d487-43c7-9965-5697505667fb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.970963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq5rq\" (UniqueName: \"kubernetes.io/projected/da31620d-afd3-4129-a12b-bfddaead4abd-kube-api-access-lq5rq\") pod \"glance-operator-controller-manager-79df6bcc97-2tj58\" (UID: \"da31620d-afd3-4129-a12b-bfddaead4abd\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.970980 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq8dl\" (UniqueName: \"kubernetes.io/projected/250680a5-b697-4c2b-9180-3919204f246e-kube-api-access-jq8dl\") pod \"heat-operator-controller-manager-67dd5f86f5-2gnd8\" (UID: \"250680a5-b697-4c2b-9180-3919204f246e\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.973945 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.974678 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.977630 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.977811 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f9shg" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.981374 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw"] Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.983844 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.989161 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vxkkk" Mar 17 01:25:56 crc kubenswrapper[4735]: I0317 01:25:56.993220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.002961 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.005685 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4v2\" (UniqueName: \"kubernetes.io/projected/33c29a1c-b6e1-4b71-a0ed-b9a8851a0558-kube-api-access-4n4v2\") pod \"barbican-operator-controller-manager-59bc569d95-wtjk5\" (UID: \"33c29a1c-b6e1-4b71-a0ed-b9a8851a0558\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:25:57 crc kubenswrapper[4735]: 
I0317 01:25:57.011635 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxwrd\" (UniqueName: \"kubernetes.io/projected/37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575-kube-api-access-dxwrd\") pod \"cinder-operator-controller-manager-8d58dc466-dsd6v\" (UID: \"37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.021282 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.022128 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.029494 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4hxp2" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.051999 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxm2\" (UniqueName: \"kubernetes.io/projected/8c6fa0c7-2f56-4498-87ee-7ae0f64f262e-kube-api-access-8cxm2\") pod \"ironic-operator-controller-manager-6f787dddc9-t6nkw\" (UID: \"8c6fa0c7-2f56-4498-87ee-7ae0f64f262e\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clw8r\" (UniqueName: \"kubernetes.io/projected/b596dfa6-ef2c-4c3c-80fb-f18229f7b99f-kube-api-access-clw8r\") 
pod \"keystone-operator-controller-manager-768b96df4c-vm55l\" (UID: \"b596dfa6-ef2c-4c3c-80fb-f18229f7b99f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077426 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29kd\" (UniqueName: \"kubernetes.io/projected/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-kube-api-access-g29kd\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c86v\" (UniqueName: \"kubernetes.io/projected/a628f7da-d487-43c7-9965-5697505667fb-kube-api-access-8c86v\") pod \"horizon-operator-controller-manager-8464cc45fb-r4z8g\" (UID: \"a628f7da-d487-43c7-9965-5697505667fb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077497 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq5rq\" (UniqueName: \"kubernetes.io/projected/da31620d-afd3-4129-a12b-bfddaead4abd-kube-api-access-lq5rq\") pod \"glance-operator-controller-manager-79df6bcc97-2tj58\" (UID: \"da31620d-afd3-4129-a12b-bfddaead4abd\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077515 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq8dl\" (UniqueName: \"kubernetes.io/projected/250680a5-b697-4c2b-9180-3919204f246e-kube-api-access-jq8dl\") pod \"heat-operator-controller-manager-67dd5f86f5-2gnd8\" (UID: \"250680a5-b697-4c2b-9180-3919204f246e\") " 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077560 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.077599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzw7d\" (UniqueName: \"kubernetes.io/projected/4b767f40-0d53-4067-a546-0f14da7659bc-kube-api-access-vzw7d\") pod \"designate-operator-controller-manager-588d4d986b-mrkzz\" (UID: \"4b767f40-0d53-4067-a546-0f14da7659bc\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.085425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.103187 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.104540 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.105227 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.113898 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.114672 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4wvpw" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.115531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq5rq\" (UniqueName: \"kubernetes.io/projected/da31620d-afd3-4129-a12b-bfddaead4abd-kube-api-access-lq5rq\") pod \"glance-operator-controller-manager-79df6bcc97-2tj58\" (UID: \"da31620d-afd3-4129-a12b-bfddaead4abd\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.116156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzw7d\" (UniqueName: \"kubernetes.io/projected/4b767f40-0d53-4067-a546-0f14da7659bc-kube-api-access-vzw7d\") pod \"designate-operator-controller-manager-588d4d986b-mrkzz\" (UID: \"4b767f40-0d53-4067-a546-0f14da7659bc\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.119000 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.143012 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mgx6c" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.145840 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.146728 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c86v\" (UniqueName: \"kubernetes.io/projected/a628f7da-d487-43c7-9965-5697505667fb-kube-api-access-8c86v\") pod \"horizon-operator-controller-manager-8464cc45fb-r4z8g\" (UID: \"a628f7da-d487-43c7-9965-5697505667fb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.156636 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.178869 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.180664 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.180721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpsjn\" (UniqueName: \"kubernetes.io/projected/4701b52d-3086-4792-bc35-c51cf4d63ad8-kube-api-access-fpsjn\") pod \"mariadb-operator-controller-manager-67ccfc9778-gwwjh\" (UID: \"4701b52d-3086-4792-bc35-c51cf4d63ad8\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.180761 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld59b\" (UniqueName: \"kubernetes.io/projected/9c858fbe-58b8-4dea-91e5-05366d1bd648-kube-api-access-ld59b\") pod \"manila-operator-controller-manager-55f864c847-7cjd9\" (UID: \"9c858fbe-58b8-4dea-91e5-05366d1bd648\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.180789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxm2\" (UniqueName: \"kubernetes.io/projected/8c6fa0c7-2f56-4498-87ee-7ae0f64f262e-kube-api-access-8cxm2\") pod \"ironic-operator-controller-manager-6f787dddc9-t6nkw\" (UID: \"8c6fa0c7-2f56-4498-87ee-7ae0f64f262e\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.180810 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clw8r\" (UniqueName: \"kubernetes.io/projected/b596dfa6-ef2c-4c3c-80fb-f18229f7b99f-kube-api-access-clw8r\") pod \"keystone-operator-controller-manager-768b96df4c-vm55l\" (UID: \"b596dfa6-ef2c-4c3c-80fb-f18229f7b99f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.180833 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g29kd\" (UniqueName: \"kubernetes.io/projected/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-kube-api-access-g29kd\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.181832 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq8dl\" (UniqueName: 
\"kubernetes.io/projected/250680a5-b697-4c2b-9180-3919204f246e-kube-api-access-jq8dl\") pod \"heat-operator-controller-manager-67dd5f86f5-2gnd8\" (UID: \"250680a5-b697-4c2b-9180-3919204f246e\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:25:57 crc kubenswrapper[4735]: E0317 01:25:57.181975 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 01:25:57 crc kubenswrapper[4735]: E0317 01:25:57.182014 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert podName:5fc876da-b6d0-4c8b-ab2e-84558e5ba079 nodeName:}" failed. No retries permitted until 2026-03-17 01:25:57.682000296 +0000 UTC m=+983.314233274 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert") pod "infra-operator-controller-manager-7b9c774f96-v78gc" (UID: "5fc876da-b6d0-4c8b-ab2e-84558e5ba079") : secret "infra-operator-webhook-server-cert" not found Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.193948 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.208986 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.209712 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.211574 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxm2\" (UniqueName: \"kubernetes.io/projected/8c6fa0c7-2f56-4498-87ee-7ae0f64f262e-kube-api-access-8cxm2\") pod \"ironic-operator-controller-manager-6f787dddc9-t6nkw\" (UID: \"8c6fa0c7-2f56-4498-87ee-7ae0f64f262e\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.219622 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qxbn8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.235522 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clw8r\" (UniqueName: \"kubernetes.io/projected/b596dfa6-ef2c-4c3c-80fb-f18229f7b99f-kube-api-access-clw8r\") pod \"keystone-operator-controller-manager-768b96df4c-vm55l\" (UID: \"b596dfa6-ef2c-4c3c-80fb-f18229f7b99f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.254003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29kd\" (UniqueName: \"kubernetes.io/projected/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-kube-api-access-g29kd\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.254595 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.294054 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpsjn\" (UniqueName: \"kubernetes.io/projected/4701b52d-3086-4792-bc35-c51cf4d63ad8-kube-api-access-fpsjn\") pod \"mariadb-operator-controller-manager-67ccfc9778-gwwjh\" (UID: \"4701b52d-3086-4792-bc35-c51cf4d63ad8\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.294146 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld59b\" (UniqueName: \"kubernetes.io/projected/9c858fbe-58b8-4dea-91e5-05366d1bd648-kube-api-access-ld59b\") pod \"manila-operator-controller-manager-55f864c847-7cjd9\" (UID: \"9c858fbe-58b8-4dea-91e5-05366d1bd648\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.295057 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.307652 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.341212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld59b\" (UniqueName: \"kubernetes.io/projected/9c858fbe-58b8-4dea-91e5-05366d1bd648-kube-api-access-ld59b\") pod \"manila-operator-controller-manager-55f864c847-7cjd9\" (UID: \"9c858fbe-58b8-4dea-91e5-05366d1bd648\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.341733 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.350433 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpsjn\" (UniqueName: \"kubernetes.io/projected/4701b52d-3086-4792-bc35-c51cf4d63ad8-kube-api-access-fpsjn\") pod \"mariadb-operator-controller-manager-67ccfc9778-gwwjh\" (UID: \"4701b52d-3086-4792-bc35-c51cf4d63ad8\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.350527 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.352192 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.355339 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k7c9l" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.359624 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.361425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.363161 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.363872 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-48x6v" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.371699 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.395044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q985l\" (UniqueName: \"kubernetes.io/projected/486810ee-5bdf-451a-bc69-179723bbe75d-kube-api-access-q985l\") pod \"nova-operator-controller-manager-5d488d59fb-lhsm8\" (UID: \"486810ee-5bdf-451a-bc69-179723bbe75d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.395105 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchnf\" (UniqueName: \"kubernetes.io/projected/eefbeadd-f37e-4419-8d66-ba1731016fd0-kube-api-access-tchnf\") pod \"octavia-operator-controller-manager-5b9f45d989-kklq4\" (UID: \"eefbeadd-f37e-4419-8d66-ba1731016fd0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.395166 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gggx5\" (UniqueName: \"kubernetes.io/projected/6306f8c1-066b-46e3-b76c-5490680c0ae3-kube-api-access-gggx5\") pod \"neutron-operator-controller-manager-767865f676-4dmmd\" (UID: \"6306f8c1-066b-46e3-b76c-5490680c0ae3\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.485036 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.495920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchnf\" (UniqueName: 
\"kubernetes.io/projected/eefbeadd-f37e-4419-8d66-ba1731016fd0-kube-api-access-tchnf\") pod \"octavia-operator-controller-manager-5b9f45d989-kklq4\" (UID: \"eefbeadd-f37e-4419-8d66-ba1731016fd0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.497364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gggx5\" (UniqueName: \"kubernetes.io/projected/6306f8c1-066b-46e3-b76c-5490680c0ae3-kube-api-access-gggx5\") pod \"neutron-operator-controller-manager-767865f676-4dmmd\" (UID: \"6306f8c1-066b-46e3-b76c-5490680c0ae3\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.497420 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q985l\" (UniqueName: \"kubernetes.io/projected/486810ee-5bdf-451a-bc69-179723bbe75d-kube-api-access-q985l\") pod \"nova-operator-controller-manager-5d488d59fb-lhsm8\" (UID: \"486810ee-5bdf-451a-bc69-179723bbe75d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.504391 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.505236 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.511007 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.511700 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.521053 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.532764 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t572k" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.539287 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.561148 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.562444 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.569534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q985l\" (UniqueName: \"kubernetes.io/projected/486810ee-5bdf-451a-bc69-179723bbe75d-kube-api-access-q985l\") pod \"nova-operator-controller-manager-5d488d59fb-lhsm8\" (UID: \"486810ee-5bdf-451a-bc69-179723bbe75d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.570677 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-65ch7" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.571396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchnf\" (UniqueName: \"kubernetes.io/projected/eefbeadd-f37e-4419-8d66-ba1731016fd0-kube-api-access-tchnf\") pod \"octavia-operator-controller-manager-5b9f45d989-kklq4\" (UID: \"eefbeadd-f37e-4419-8d66-ba1731016fd0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.574497 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gggx5\" (UniqueName: \"kubernetes.io/projected/6306f8c1-066b-46e3-b76c-5490680c0ae3-kube-api-access-gggx5\") pod \"neutron-operator-controller-manager-767865f676-4dmmd\" (UID: \"6306f8c1-066b-46e3-b76c-5490680c0ae3\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.587705 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.599423 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-v2knm"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.600251 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.601823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4s9f\" (UniqueName: \"kubernetes.io/projected/1f207812-4855-4e6c-9c7d-64d45ac3c917-kube-api-access-p4s9f\") pod \"ovn-operator-controller-manager-884679f54-t8wv8\" (UID: \"1f207812-4855-4e6c-9c7d-64d45ac3c917\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.601882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwg4t\" (UniqueName: \"kubernetes.io/projected/77baae0d-2b5c-4b05-ab71-c87259ef645a-kube-api-access-wwg4t\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.601916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9wc\" (UniqueName: \"kubernetes.io/projected/4ffd9139-49ec-4622-bed5-0744011e0469-kube-api-access-7m9wc\") pod \"swift-operator-controller-manager-c674c5965-v2knm\" (UID: \"4ffd9139-49ec-4622-bed5-0744011e0469\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:25:57 crc kubenswrapper[4735]: 
I0317 01:25:57.601939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.603674 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-257t8"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.604559 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.626284 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-d8tnc" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.640607 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rhxw8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.660982 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.662999 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.675317 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-x4gr5" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.694160 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.697754 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713122 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713384 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdt2z\" (UniqueName: \"kubernetes.io/projected/ae2a2402-a926-4331-aed4-5b25fc55b9ba-kube-api-access-tdt2z\") pod \"placement-operator-controller-manager-5784578c99-257t8\" (UID: \"ae2a2402-a926-4331-aed4-5b25fc55b9ba\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwg4t\" (UniqueName: \"kubernetes.io/projected/77baae0d-2b5c-4b05-ab71-c87259ef645a-kube-api-access-wwg4t\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713475 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9wc\" (UniqueName: \"kubernetes.io/projected/4ffd9139-49ec-4622-bed5-0744011e0469-kube-api-access-7m9wc\") pod \"swift-operator-controller-manager-c674c5965-v2knm\" (UID: \"4ffd9139-49ec-4622-bed5-0744011e0469\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713503 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqxr\" (UniqueName: \"kubernetes.io/projected/e786fada-7145-4c03-a0bb-073321237c38-kube-api-access-6sqxr\") pod \"telemetry-operator-controller-manager-d6b694c5-wnncr\" (UID: \"e786fada-7145-4c03-a0bb-073321237c38\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713608 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4s9f\" (UniqueName: \"kubernetes.io/projected/1f207812-4855-4e6c-9c7d-64d45ac3c917-kube-api-access-p4s9f\") pod \"ovn-operator-controller-manager-884679f54-t8wv8\" (UID: \"1f207812-4855-4e6c-9c7d-64d45ac3c917\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.713636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:57 crc kubenswrapper[4735]: E0317 01:25:57.713727 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 01:25:57 crc kubenswrapper[4735]: E0317 
01:25:57.713769 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert podName:5fc876da-b6d0-4c8b-ab2e-84558e5ba079 nodeName:}" failed. No retries permitted until 2026-03-17 01:25:58.713754829 +0000 UTC m=+984.345987807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert") pod "infra-operator-controller-manager-7b9c774f96-v78gc" (UID: "5fc876da-b6d0-4c8b-ab2e-84558e5ba079") : secret "infra-operator-webhook-server-cert" not found Mar 17 01:25:57 crc kubenswrapper[4735]: E0317 01:25:57.713837 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:25:57 crc kubenswrapper[4735]: E0317 01:25:57.713884 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert podName:77baae0d-2b5c-4b05-ab71-c87259ef645a nodeName:}" failed. No retries permitted until 2026-03-17 01:25:58.213854411 +0000 UTC m=+983.846087389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fwv89" (UID: "77baae0d-2b5c-4b05-ab71-c87259ef645a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.765304 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-v2knm"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.774265 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwg4t\" (UniqueName: \"kubernetes.io/projected/77baae0d-2b5c-4b05-ab71-c87259ef645a-kube-api-access-wwg4t\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.784361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4s9f\" (UniqueName: \"kubernetes.io/projected/1f207812-4855-4e6c-9c7d-64d45ac3c917-kube-api-access-p4s9f\") pod \"ovn-operator-controller-manager-884679f54-t8wv8\" (UID: \"1f207812-4855-4e6c-9c7d-64d45ac3c917\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.804925 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-257t8"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.811444 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9wc\" (UniqueName: \"kubernetes.io/projected/4ffd9139-49ec-4622-bed5-0744011e0469-kube-api-access-7m9wc\") pod \"swift-operator-controller-manager-c674c5965-v2knm\" (UID: \"4ffd9139-49ec-4622-bed5-0744011e0469\") " 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.828028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqxr\" (UniqueName: \"kubernetes.io/projected/e786fada-7145-4c03-a0bb-073321237c38-kube-api-access-6sqxr\") pod \"telemetry-operator-controller-manager-d6b694c5-wnncr\" (UID: \"e786fada-7145-4c03-a0bb-073321237c38\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.828101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdt2z\" (UniqueName: \"kubernetes.io/projected/ae2a2402-a926-4331-aed4-5b25fc55b9ba-kube-api-access-tdt2z\") pod \"placement-operator-controller-manager-5784578c99-257t8\" (UID: \"ae2a2402-a926-4331-aed4-5b25fc55b9ba\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.841459 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.842302 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.857494 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqxr\" (UniqueName: \"kubernetes.io/projected/e786fada-7145-4c03-a0bb-073321237c38-kube-api-access-6sqxr\") pod \"telemetry-operator-controller-manager-d6b694c5-wnncr\" (UID: \"e786fada-7145-4c03-a0bb-073321237c38\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.864946 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-m7v6h" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.872099 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.872936 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.909522 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-cbr8s" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.918170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdt2z\" (UniqueName: \"kubernetes.io/projected/ae2a2402-a926-4331-aed4-5b25fc55b9ba-kube-api-access-tdt2z\") pod \"placement-operator-controller-manager-5784578c99-257t8\" (UID: \"ae2a2402-a926-4331-aed4-5b25fc55b9ba\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.926026 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.929451 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87pdr\" (UniqueName: \"kubernetes.io/projected/9e4382a5-eb49-4a6b-8ee0-4692cad88ae7-kube-api-access-87pdr\") pod \"test-operator-controller-manager-5c5cb9c4d7-p9zdz\" (UID: \"9e4382a5-eb49-4a6b-8ee0-4692cad88ae7\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.929582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m28f\" (UniqueName: \"kubernetes.io/projected/f6c692fb-2aa5-47c4-8036-265ad9d63131-kube-api-access-5m28f\") pod \"watcher-operator-controller-manager-6c4d75f7f9-txd6q\" (UID: \"f6c692fb-2aa5-47c4-8036-265ad9d63131\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.945002 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz"] Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.960409 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:25:57 crc kubenswrapper[4735]: I0317 01:25:57.994119 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.019922 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.042437 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.043352 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m28f\" (UniqueName: \"kubernetes.io/projected/f6c692fb-2aa5-47c4-8036-265ad9d63131-kube-api-access-5m28f\") pod \"watcher-operator-controller-manager-6c4d75f7f9-txd6q\" (UID: \"f6c692fb-2aa5-47c4-8036-265ad9d63131\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.043386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87pdr\" (UniqueName: \"kubernetes.io/projected/9e4382a5-eb49-4a6b-8ee0-4692cad88ae7-kube-api-access-87pdr\") pod \"test-operator-controller-manager-5c5cb9c4d7-p9zdz\" (UID: \"9e4382a5-eb49-4a6b-8ee0-4692cad88ae7\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.052621 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.071228 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m28f\" (UniqueName: \"kubernetes.io/projected/f6c692fb-2aa5-47c4-8036-265ad9d63131-kube-api-access-5m28f\") pod \"watcher-operator-controller-manager-6c4d75f7f9-txd6q\" (UID: \"f6c692fb-2aa5-47c4-8036-265ad9d63131\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.100554 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.101426 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.103741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87pdr\" (UniqueName: \"kubernetes.io/projected/9e4382a5-eb49-4a6b-8ee0-4692cad88ae7-kube-api-access-87pdr\") pod \"test-operator-controller-manager-5c5cb9c4d7-p9zdz\" (UID: \"9e4382a5-eb49-4a6b-8ee0-4692cad88ae7\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.109067 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.109341 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-t52jc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.109971 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.144087 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8sbq\" (UniqueName: \"kubernetes.io/projected/9a63ed64-5e87-4f3d-8568-284237818e90-kube-api-access-s8sbq\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.144149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " 
pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.144190 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.150022 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.228802 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.234706 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.239404 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.244763 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-m5892" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.245696 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.245789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.245840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8sbq\" (UniqueName: \"kubernetes.io/projected/9a63ed64-5e87-4f3d-8568-284237818e90-kube-api-access-s8sbq\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.245914 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: 
\"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.246067 4735 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.246131 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:25:58.746112135 +0000 UTC m=+984.378345113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "metrics-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.246467 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.246525 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert podName:77baae0d-2b5c-4b05-ab71-c87259ef645a nodeName:}" failed. No retries permitted until 2026-03-17 01:25:59.246507405 +0000 UTC m=+984.878740383 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fwv89" (UID: "77baae0d-2b5c-4b05-ab71-c87259ef645a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.246531 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.246557 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:25:58.746549826 +0000 UTC m=+984.378782804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.246575 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.259500 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.326847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8sbq\" (UniqueName: \"kubernetes.io/projected/9a63ed64-5e87-4f3d-8568-284237818e90-kube-api-access-s8sbq\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.356511 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbv67\" (UniqueName: \"kubernetes.io/projected/c86b3820-e6e8-41e2-85f4-69a8fda476a2-kube-api-access-xbv67\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5brsc\" (UID: \"c86b3820-e6e8-41e2-85f4-69a8fda476a2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.408240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" event={"ID":"37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575","Type":"ContainerStarted","Data":"a87deb3ec47fd0bf9cae3fad05a758ee7be1ca125b8d9a6d007dbca8e0f45003"} Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.457336 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbv67\" (UniqueName: \"kubernetes.io/projected/c86b3820-e6e8-41e2-85f4-69a8fda476a2-kube-api-access-xbv67\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5brsc\" (UID: \"c86b3820-e6e8-41e2-85f4-69a8fda476a2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.507439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xbv67\" (UniqueName: \"kubernetes.io/projected/c86b3820-e6e8-41e2-85f4-69a8fda476a2-kube-api-access-xbv67\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5brsc\" (UID: \"c86b3820-e6e8-41e2-85f4-69a8fda476a2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.507513 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.517986 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.529224 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.661910 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.761678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.761719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.761756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.761932 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.761981 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:25:59.76196696 +0000 UTC m=+985.394199938 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.762283 4735 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.762311 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:25:59.762305059 +0000 UTC m=+985.394538037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "metrics-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.762345 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: E0317 01:25:58.762362 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert podName:5fc876da-b6d0-4c8b-ab2e-84558e5ba079 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:00.76235712 +0000 UTC m=+986.394590098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert") pod "infra-operator-controller-manager-7b9c774f96-v78gc" (UID: "5fc876da-b6d0-4c8b-ab2e-84558e5ba079") : secret "infra-operator-webhook-server-cert" not found Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.818083 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz"] Mar 17 01:25:58 crc kubenswrapper[4735]: I0317 01:25:58.837675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.279579 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.279948 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.279997 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert podName:77baae0d-2b5c-4b05-ab71-c87259ef645a nodeName:}" failed. No retries permitted until 2026-03-17 01:26:01.279981187 +0000 UTC m=+986.912214165 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fwv89" (UID: "77baae0d-2b5c-4b05-ab71-c87259ef645a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.298896 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.326614 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.340935 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.355056 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-v2knm"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.357660 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.378836 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4"] Mar 17 01:25:59 crc kubenswrapper[4735]: W0317 01:25:59.386456 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda628f7da_d487_43c7_9965_5697505667fb.slice/crio-5ed05fee4e88f8c76118f310638a4e31cf69ea169b7ed53adf8dc07f68a506ba WatchSource:0}: Error finding container 5ed05fee4e88f8c76118f310638a4e31cf69ea169b7ed53adf8dc07f68a506ba: Status 404 returned error can't find the container with id 
5ed05fee4e88f8c76118f310638a4e31cf69ea169b7ed53adf8dc07f68a506ba Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.387974 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.425170 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9"] Mar 17 01:25:59 crc kubenswrapper[4735]: W0317 01:25:59.432472 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeefbeadd_f37e_4419_8d66_ba1731016fd0.slice/crio-67facee9001878997c54ace857285a25446190a683a92c5d6ef8ee77a186f392 WatchSource:0}: Error finding container 67facee9001878997c54ace857285a25446190a683a92c5d6ef8ee77a186f392: Status 404 returned error can't find the container with id 67facee9001878997c54ace857285a25446190a683a92c5d6ef8ee77a186f392 Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.443639 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.453573 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" event={"ID":"6306f8c1-066b-46e3-b76c-5490680c0ae3","Type":"ContainerStarted","Data":"8baa3e8220206819563f5413d6b928e9dd9bb110f842a097013e1eba427b76bc"} Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.457329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" event={"ID":"250680a5-b697-4c2b-9180-3919204f246e","Type":"ContainerStarted","Data":"b3bf7dd6d83e6b3d6cb3b6ac52b6f76e04bfad1774108c3244b4f60d65660418"} Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.459749 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" event={"ID":"8c6fa0c7-2f56-4498-87ee-7ae0f64f262e","Type":"ContainerStarted","Data":"60884f4d6898ceabd8806b62ff6f09df26584cda0aaf9700eaebe96b802f0638"} Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.462832 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" event={"ID":"da31620d-afd3-4129-a12b-bfddaead4abd","Type":"ContainerStarted","Data":"c73b5935755a55b01ae112295142904bf09d92dfce403072364a9c711ac2c2a0"} Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.465001 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.468115 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" event={"ID":"1f207812-4855-4e6c-9c7d-64d45ac3c917","Type":"ContainerStarted","Data":"b1265c5636c0a269e6572365054f1591ce85237377093b74193cd918bea791ce"} Mar 17 01:25:59 crc kubenswrapper[4735]: W0317 01:25:59.470636 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2a2402_a926_4331_aed4_5b25fc55b9ba.slice/crio-79a4098f31efcd1f4883a4447231e0dccde765499f3c57d267d59dda4d8d07a2 WatchSource:0}: Error finding container 79a4098f31efcd1f4883a4447231e0dccde765499f3c57d267d59dda4d8d07a2: Status 404 returned error can't find the container with id 79a4098f31efcd1f4883a4447231e0dccde765499f3c57d267d59dda4d8d07a2 Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.473309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" 
event={"ID":"b596dfa6-ef2c-4c3c-80fb-f18229f7b99f","Type":"ContainerStarted","Data":"4eb2549d9dc64faa458bba906972d2b16937723170a3d2115807f4903c3a1933"} Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.474667 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr"] Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.476683 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6sqxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-wnncr_openstack-operators(e786fada-7145-4c03-a0bb-073321237c38): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.477685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" event={"ID":"4b767f40-0d53-4067-a546-0f14da7659bc","Type":"ContainerStarted","Data":"8b9dbf25f9149a7ec136d27b68fdf30c99b31f1977d14e45df863226f01bcc98"} Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.477759 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" podUID="e786fada-7145-4c03-a0bb-073321237c38" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.479402 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" 
event={"ID":"33c29a1c-b6e1-4b71-a0ed-b9a8851a0558","Type":"ContainerStarted","Data":"550ce82f3063da561d22657acf1e6c81bf9cc998f22b2d9a74936b99f7dc4964"} Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.482546 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xbv67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5brsc_openstack-operators(c86b3820-e6e8-41e2-85f4-69a8fda476a2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 01:25:59 crc kubenswrapper[4735]: W0317 01:25:59.482690 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ffd9139_49ec_4622_bed5_0744011e0469.slice/crio-4f9564e4c4e2d2d2cea2b357f32ac5a13757a20b477dc47523d4217013155d6e WatchSource:0}: Error finding container 4f9564e4c4e2d2d2cea2b357f32ac5a13757a20b477dc47523d4217013155d6e: Status 404 returned error can't find the container with id 4f9564e4c4e2d2d2cea2b357f32ac5a13757a20b477dc47523d4217013155d6e Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.482891 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" event={"ID":"a628f7da-d487-43c7-9965-5697505667fb","Type":"ContainerStarted","Data":"5ed05fee4e88f8c76118f310638a4e31cf69ea169b7ed53adf8dc07f68a506ba"} Mar 17 01:25:59 crc kubenswrapper[4735]: W0317 
01:25:59.482987 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c692fb_2aa5_47c4_8036_265ad9d63131.slice/crio-25df486ca54da8e533f2ea4cfaf2d04c372f0d09f3f6719a03e10be879240c00 WatchSource:0}: Error finding container 25df486ca54da8e533f2ea4cfaf2d04c372f0d09f3f6719a03e10be879240c00: Status 404 returned error can't find the container with id 25df486ca54da8e533f2ea4cfaf2d04c372f0d09f3f6719a03e10be879240c00 Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.483660 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" podUID="c86b3820-e6e8-41e2-85f4-69a8fda476a2" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.485264 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-257t8"] Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.487218 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7m9wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-v2knm_openstack-operators(4ffd9139-49ec-4622-bed5-0744011e0469): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.488183 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5m28f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-txd6q_openstack-operators(f6c692fb-2aa5-47c4-8036-265ad9d63131): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.488331 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" podUID="4ffd9139-49ec-4622-bed5-0744011e0469" Mar 17 01:25:59 crc kubenswrapper[4735]: W0317 01:25:59.488796 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4382a5_eb49_4a6b_8ee0_4692cad88ae7.slice/crio-e97ed312e4b84aa7e7a57bcebf387f8fc121c1ca656909cfb2494c3d18815c89 WatchSource:0}: Error finding container e97ed312e4b84aa7e7a57bcebf387f8fc121c1ca656909cfb2494c3d18815c89: Status 404 returned error can't find the container with id e97ed312e4b84aa7e7a57bcebf387f8fc121c1ca656909cfb2494c3d18815c89 Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.491046 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" podUID="f6c692fb-2aa5-47c4-8036-265ad9d63131" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.493079 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q"] Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.495946 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-87pdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-p9zdz_openstack-operators(9e4382a5-eb49-4a6b-8ee0-4692cad88ae7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.496806 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh"] Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.497735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" podUID="9e4382a5-eb49-4a6b-8ee0-4692cad88ae7" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.522265 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc"] Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.796554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod 
\"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:59 crc kubenswrapper[4735]: I0317 01:25:59.796621 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.796745 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.796780 4735 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.796801 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:01.796786004 +0000 UTC m=+987.429018982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "webhook-server-cert" not found Mar 17 01:25:59 crc kubenswrapper[4735]: E0317 01:25:59.796882 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. 
No retries permitted until 2026-03-17 01:26:01.796849817 +0000 UTC m=+987.429082785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "metrics-server-cert" not found Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.126708 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561846-4jp52"] Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.133344 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.135583 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.137048 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.139481 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.155840 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-4jp52"] Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.204576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6lkv\" (UniqueName: \"kubernetes.io/projected/890c33ed-65c3-4f76-9da9-99b3f3e8ef33-kube-api-access-b6lkv\") pod \"auto-csr-approver-29561846-4jp52\" (UID: \"890c33ed-65c3-4f76-9da9-99b3f3e8ef33\") " pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.305705 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6lkv\" (UniqueName: \"kubernetes.io/projected/890c33ed-65c3-4f76-9da9-99b3f3e8ef33-kube-api-access-b6lkv\") pod \"auto-csr-approver-29561846-4jp52\" (UID: \"890c33ed-65c3-4f76-9da9-99b3f3e8ef33\") " pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.344779 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6lkv\" (UniqueName: \"kubernetes.io/projected/890c33ed-65c3-4f76-9da9-99b3f3e8ef33-kube-api-access-b6lkv\") pod \"auto-csr-approver-29561846-4jp52\" (UID: \"890c33ed-65c3-4f76-9da9-99b3f3e8ef33\") " pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.462221 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.512162 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" event={"ID":"f6c692fb-2aa5-47c4-8036-265ad9d63131","Type":"ContainerStarted","Data":"25df486ca54da8e533f2ea4cfaf2d04c372f0d09f3f6719a03e10be879240c00"} Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.513850 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" podUID="f6c692fb-2aa5-47c4-8036-265ad9d63131" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.514845 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" 
event={"ID":"9e4382a5-eb49-4a6b-8ee0-4692cad88ae7","Type":"ContainerStarted","Data":"e97ed312e4b84aa7e7a57bcebf387f8fc121c1ca656909cfb2494c3d18815c89"} Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.532064 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" podUID="9e4382a5-eb49-4a6b-8ee0-4692cad88ae7" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.532211 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" event={"ID":"4ffd9139-49ec-4622-bed5-0744011e0469","Type":"ContainerStarted","Data":"4f9564e4c4e2d2d2cea2b357f32ac5a13757a20b477dc47523d4217013155d6e"} Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.536983 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" podUID="4ffd9139-49ec-4622-bed5-0744011e0469" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.539054 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" event={"ID":"4701b52d-3086-4792-bc35-c51cf4d63ad8","Type":"ContainerStarted","Data":"9d49010c350cb4b3e83b398965a8474257b94d859174cef447dd8d2465f909fc"} Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.561678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" 
event={"ID":"c86b3820-e6e8-41e2-85f4-69a8fda476a2","Type":"ContainerStarted","Data":"eaae9321f084df8e17a9245140ccc449d60b890c9dd5be6f25e5ec36c0a5d3e9"} Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.567242 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" podUID="c86b3820-e6e8-41e2-85f4-69a8fda476a2" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.573970 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" event={"ID":"e786fada-7145-4c03-a0bb-073321237c38","Type":"ContainerStarted","Data":"cdca30b75a79dfebca02fadbb7d1059c63740a1a3a4a394dbf5c30bafdcc0971"} Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.581104 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" podUID="e786fada-7145-4c03-a0bb-073321237c38" Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.593388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" event={"ID":"ae2a2402-a926-4331-aed4-5b25fc55b9ba","Type":"ContainerStarted","Data":"79a4098f31efcd1f4883a4447231e0dccde765499f3c57d267d59dda4d8d07a2"} Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.598672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" event={"ID":"486810ee-5bdf-451a-bc69-179723bbe75d","Type":"ContainerStarted","Data":"8d6660a2aeaabade3a91071a2a2e27c6aa703de1366b2bb82af30c93d102b352"} Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.624128 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" event={"ID":"eefbeadd-f37e-4419-8d66-ba1731016fd0","Type":"ContainerStarted","Data":"67facee9001878997c54ace857285a25446190a683a92c5d6ef8ee77a186f392"} Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.625289 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" event={"ID":"9c858fbe-58b8-4dea-91e5-05366d1bd648","Type":"ContainerStarted","Data":"8161e539f8355d6cc2563a400538ce7aa36497aaabc0d1258d68ea78e038920c"} Mar 17 01:26:00 crc kubenswrapper[4735]: I0317 01:26:00.825531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.826019 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 01:26:00 crc kubenswrapper[4735]: E0317 01:26:00.826067 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert podName:5fc876da-b6d0-4c8b-ab2e-84558e5ba079 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:04.826050985 +0000 UTC m=+990.458283953 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert") pod "infra-operator-controller-manager-7b9c774f96-v78gc" (UID: "5fc876da-b6d0-4c8b-ab2e-84558e5ba079") : secret "infra-operator-webhook-server-cert" not found Mar 17 01:26:01 crc kubenswrapper[4735]: I0317 01:26:01.222059 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-4jp52"] Mar 17 01:26:01 crc kubenswrapper[4735]: I0317 01:26:01.336668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.336804 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.336924 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert podName:77baae0d-2b5c-4b05-ab71-c87259ef645a nodeName:}" failed. No retries permitted until 2026-03-17 01:26:05.336908736 +0000 UTC m=+990.969141714 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fwv89" (UID: "77baae0d-2b5c-4b05-ab71-c87259ef645a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:26:01 crc kubenswrapper[4735]: I0317 01:26:01.660487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-4jp52" event={"ID":"890c33ed-65c3-4f76-9da9-99b3f3e8ef33","Type":"ContainerStarted","Data":"afd9eaf8ae470ee8fc0a24a82880912bc2f78ffca2f968b8a783524c85218d39"} Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.665821 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" podUID="f6c692fb-2aa5-47c4-8036-265ad9d63131" Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.679281 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" podUID="9e4382a5-eb49-4a6b-8ee0-4692cad88ae7" Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.679386 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" podUID="e786fada-7145-4c03-a0bb-073321237c38" Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.679431 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" podUID="4ffd9139-49ec-4622-bed5-0744011e0469" Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.679477 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" podUID="c86b3820-e6e8-41e2-85f4-69a8fda476a2" Mar 17 01:26:01 crc kubenswrapper[4735]: I0317 01:26:01.854503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:01 crc kubenswrapper[4735]: I0317 01:26:01.854570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:01 crc 
kubenswrapper[4735]: E0317 01:26:01.854710 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.854756 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:05.854740809 +0000 UTC m=+991.486973787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "webhook-server-cert" not found Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.855118 4735 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 01:26:01 crc kubenswrapper[4735]: E0317 01:26:01.855141 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:05.8551343 +0000 UTC m=+991.487367278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "metrics-server-cert" not found Mar 17 01:26:04 crc kubenswrapper[4735]: I0317 01:26:04.920826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:04 crc kubenswrapper[4735]: E0317 01:26:04.921015 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 01:26:04 crc kubenswrapper[4735]: E0317 01:26:04.921264 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert podName:5fc876da-b6d0-4c8b-ab2e-84558e5ba079 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:12.921245248 +0000 UTC m=+998.553478236 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert") pod "infra-operator-controller-manager-7b9c774f96-v78gc" (UID: "5fc876da-b6d0-4c8b-ab2e-84558e5ba079") : secret "infra-operator-webhook-server-cert" not found Mar 17 01:26:05 crc kubenswrapper[4735]: I0317 01:26:05.426271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:05 crc kubenswrapper[4735]: E0317 01:26:05.426448 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:26:05 crc kubenswrapper[4735]: E0317 01:26:05.426491 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert podName:77baae0d-2b5c-4b05-ab71-c87259ef645a nodeName:}" failed. No retries permitted until 2026-03-17 01:26:13.426477833 +0000 UTC m=+999.058710811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fwv89" (UID: "77baae0d-2b5c-4b05-ab71-c87259ef645a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:26:05 crc kubenswrapper[4735]: I0317 01:26:05.937461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:05 crc kubenswrapper[4735]: E0317 01:26:05.937692 4735 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 01:26:05 crc kubenswrapper[4735]: E0317 01:26:05.939068 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:13.939052128 +0000 UTC m=+999.571285106 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "metrics-server-cert" not found Mar 17 01:26:05 crc kubenswrapper[4735]: I0317 01:26:05.939061 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:05 crc kubenswrapper[4735]: E0317 01:26:05.939217 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 01:26:05 crc kubenswrapper[4735]: E0317 01:26:05.939309 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:13.939284904 +0000 UTC m=+999.571517892 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "webhook-server-cert" not found Mar 17 01:26:12 crc kubenswrapper[4735]: E0317 01:26:12.538032 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d" Mar 17 01:26:12 crc kubenswrapper[4735]: E0317 01:26:12.538814 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4n4v2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-59bc569d95-wtjk5_openstack-operators(33c29a1c-b6e1-4b71-a0ed-b9a8851a0558): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:12 crc kubenswrapper[4735]: E0317 01:26:12.540042 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" podUID="33c29a1c-b6e1-4b71-a0ed-b9a8851a0558" Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.607568 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.608455 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.608641 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.611528 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7547be811f368c1658ee9c2df155bfbafe2bd7de1f9eeaf1d1a2d6245d328110"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.611710 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://7547be811f368c1658ee9c2df155bfbafe2bd7de1f9eeaf1d1a2d6245d328110" gracePeriod=600 Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.809452 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="7547be811f368c1658ee9c2df155bfbafe2bd7de1f9eeaf1d1a2d6245d328110" exitCode=0 Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.809564 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"7547be811f368c1658ee9c2df155bfbafe2bd7de1f9eeaf1d1a2d6245d328110"} Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.809633 4735 scope.go:117] "RemoveContainer" containerID="89c7cb5343209fd70c941b27bbdf54f6b99a837d5fb5aeda971561981f6bf62a" Mar 17 01:26:12 crc kubenswrapper[4735]: E0317 01:26:12.811131 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" podUID="33c29a1c-b6e1-4b71-a0ed-b9a8851a0558" Mar 17 01:26:12 crc kubenswrapper[4735]: I0317 01:26:12.942853 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:12 crc kubenswrapper[4735]: E0317 01:26:12.943186 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 01:26:12 crc kubenswrapper[4735]: E0317 01:26:12.943257 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert podName:5fc876da-b6d0-4c8b-ab2e-84558e5ba079 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:28.943235179 +0000 UTC m=+1014.575468187 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert") pod "infra-operator-controller-manager-7b9c774f96-v78gc" (UID: "5fc876da-b6d0-4c8b-ab2e-84558e5ba079") : secret "infra-operator-webhook-server-cert" not found Mar 17 01:26:13 crc kubenswrapper[4735]: I0317 01:26:13.456553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:13 crc kubenswrapper[4735]: E0317 01:26:13.456905 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:26:13 crc kubenswrapper[4735]: E0317 01:26:13.456958 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert podName:77baae0d-2b5c-4b05-ab71-c87259ef645a nodeName:}" failed. No retries permitted until 2026-03-17 01:26:29.456943321 +0000 UTC m=+1015.089176289 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fwv89" (UID: "77baae0d-2b5c-4b05-ab71-c87259ef645a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 01:26:13 crc kubenswrapper[4735]: I0317 01:26:13.968748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:13 crc kubenswrapper[4735]: I0317 01:26:13.968820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:13 crc kubenswrapper[4735]: E0317 01:26:13.968941 4735 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 01:26:13 crc kubenswrapper[4735]: E0317 01:26:13.968978 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 01:26:13 crc kubenswrapper[4735]: E0317 01:26:13.969024 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:29.969002942 +0000 UTC m=+1015.601235930 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "metrics-server-cert" not found Mar 17 01:26:13 crc kubenswrapper[4735]: E0317 01:26:13.969046 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs podName:9a63ed64-5e87-4f3d-8568-284237818e90 nodeName:}" failed. No retries permitted until 2026-03-17 01:26:29.969036183 +0000 UTC m=+1015.601269171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs") pod "openstack-operator-controller-manager-576dc457f-z87tj" (UID: "9a63ed64-5e87-4f3d-8568-284237818e90") : secret "webhook-server-cert" not found Mar 17 01:26:15 crc kubenswrapper[4735]: E0317 01:26:15.806332 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 17 01:26:15 crc kubenswrapper[4735]: E0317 01:26:15.806793 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vzw7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-mrkzz_openstack-operators(4b767f40-0d53-4067-a546-0f14da7659bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:15 crc kubenswrapper[4735]: E0317 01:26:15.808041 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" podUID="4b767f40-0d53-4067-a546-0f14da7659bc" Mar 17 01:26:15 crc kubenswrapper[4735]: E0317 01:26:15.832009 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" podUID="4b767f40-0d53-4067-a546-0f14da7659bc" Mar 17 01:26:17 crc kubenswrapper[4735]: E0317 01:26:17.184196 4735 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 17 01:26:17 crc kubenswrapper[4735]: E0317 01:26:17.184535 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lq5rq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-2tj58_openstack-operators(da31620d-afd3-4129-a12b-bfddaead4abd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:17 crc kubenswrapper[4735]: E0317 01:26:17.186196 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" podUID="da31620d-afd3-4129-a12b-bfddaead4abd" Mar 17 01:26:17 crc kubenswrapper[4735]: E0317 01:26:17.852244 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" podUID="da31620d-afd3-4129-a12b-bfddaead4abd" Mar 17 01:26:18 crc kubenswrapper[4735]: E0317 01:26:18.500869 4735 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da" Mar 17 01:26:18 crc kubenswrapper[4735]: E0317 01:26:18.501082 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ld59b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-7cjd9_openstack-operators(9c858fbe-58b8-4dea-91e5-05366d1bd648): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:18 crc kubenswrapper[4735]: E0317 01:26:18.502335 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" podUID="9c858fbe-58b8-4dea-91e5-05366d1bd648" Mar 17 01:26:18 crc kubenswrapper[4735]: E0317 01:26:18.861085 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" podUID="9c858fbe-58b8-4dea-91e5-05366d1bd648" Mar 17 01:26:19 crc kubenswrapper[4735]: E0317 01:26:19.433148 4735 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a" Mar 17 01:26:19 crc kubenswrapper[4735]: E0317 01:26:19.433548 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tchnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-kklq4_openstack-operators(eefbeadd-f37e-4419-8d66-ba1731016fd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:19 crc kubenswrapper[4735]: E0317 01:26:19.435744 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" podUID="eefbeadd-f37e-4419-8d66-ba1731016fd0" Mar 17 01:26:19 crc kubenswrapper[4735]: E0317 01:26:19.864362 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" podUID="eefbeadd-f37e-4419-8d66-ba1731016fd0" Mar 17 01:26:20 crc kubenswrapper[4735]: E0317 01:26:20.324820 4735 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 17 01:26:20 crc kubenswrapper[4735]: E0317 01:26:20.325036 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q985l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-lhsm8_openstack-operators(486810ee-5bdf-451a-bc69-179723bbe75d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:20 crc kubenswrapper[4735]: E0317 01:26:20.326113 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" podUID="486810ee-5bdf-451a-bc69-179723bbe75d" Mar 17 01:26:20 crc kubenswrapper[4735]: E0317 01:26:20.868023 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" podUID="486810ee-5bdf-451a-bc69-179723bbe75d" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.047598 4735 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.048213 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gggx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-4dmmd_openstack-operators(6306f8c1-066b-46e3-b76c-5490680c0ae3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.049699 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" podUID="6306f8c1-066b-46e3-b76c-5490680c0ae3" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.607738 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.607898 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-clw8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-vm55l_openstack-operators(b596dfa6-ef2c-4c3c-80fb-f18229f7b99f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.609051 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" podUID="b596dfa6-ef2c-4c3c-80fb-f18229f7b99f" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.875510 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" podUID="6306f8c1-066b-46e3-b76c-5490680c0ae3" Mar 17 01:26:21 crc kubenswrapper[4735]: E0317 01:26:21.876354 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" podUID="b596dfa6-ef2c-4c3c-80fb-f18229f7b99f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.405941 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qdj6f"] Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.407238 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.412890 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdj6f"] Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.434156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-utilities\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.434212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhcth\" (UniqueName: \"kubernetes.io/projected/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-kube-api-access-jhcth\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.434304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-catalog-content\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.535658 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-catalog-content\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.535706 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-utilities\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.535748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhcth\" (UniqueName: \"kubernetes.io/projected/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-kube-api-access-jhcth\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.536331 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-utilities\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.536456 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-catalog-content\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.553529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhcth\" (UniqueName: \"kubernetes.io/projected/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-kube-api-access-jhcth\") pod \"redhat-marketplace-qdj6f\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:23 crc kubenswrapper[4735]: I0317 01:26:23.733933 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:26 crc kubenswrapper[4735]: I0317 01:26:26.923218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"f3cc1abd0b398fcd26237f6714e9b38efa46867f6f292d0e7478cf3af5b13d4a"} Mar 17 01:26:26 crc kubenswrapper[4735]: I0317 01:26:26.972647 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdj6f"] Mar 17 01:26:26 crc kubenswrapper[4735]: W0317 01:26:26.985841 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeb4d4e5_39d8_42c5_83b7_d73cfba6095d.slice/crio-03108a9bff5ca86c76c3e1a6efd8d886f33a9d78660c24a61af3a594c03ec0a7 WatchSource:0}: Error finding container 03108a9bff5ca86c76c3e1a6efd8d886f33a9d78660c24a61af3a594c03ec0a7: Status 404 returned error can't find the container with id 03108a9bff5ca86c76c3e1a6efd8d886f33a9d78660c24a61af3a594c03ec0a7 Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.930848 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" event={"ID":"1f207812-4855-4e6c-9c7d-64d45ac3c917","Type":"ContainerStarted","Data":"cf885a06c7d7bbed5f8bb1b4b0d66f84bf6f19098defa8f0092496e4b4657ec1"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.931221 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.932203 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-4jp52" event={"ID":"890c33ed-65c3-4f76-9da9-99b3f3e8ef33","Type":"ContainerStarted","Data":"77fcb39e16c9768adfa0fa05624e0745133bc102fbdedae525200f364f4ba151"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.933447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" event={"ID":"4ffd9139-49ec-4622-bed5-0744011e0469","Type":"ContainerStarted","Data":"3df3e50a8af96391a448318df97e0b528632696dea8fcb763a9948080a896807"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.933648 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.934763 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" event={"ID":"c86b3820-e6e8-41e2-85f4-69a8fda476a2","Type":"ContainerStarted","Data":"f44654030c02118e2c59720c3b7e265ab08812148ba72519a2d4078f04254447"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.936180 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" 
event={"ID":"9e4382a5-eb49-4a6b-8ee0-4692cad88ae7","Type":"ContainerStarted","Data":"da05e3c83cf6d8c9049652e36ad0031acd1794a8e126c98be2c051e56305d328"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.936533 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.937620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" event={"ID":"ae2a2402-a926-4331-aed4-5b25fc55b9ba","Type":"ContainerStarted","Data":"2807f72d80ad2e3c15c3b660e47de96fd3521510918dc5501aa6182c6d0126e6"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.937966 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.939031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" event={"ID":"250680a5-b697-4c2b-9180-3919204f246e","Type":"ContainerStarted","Data":"90952e9391966b0643b1ed5b860ac3c9ec2431b92c8e742632ad951e870332ee"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.939363 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.940670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" event={"ID":"33c29a1c-b6e1-4b71-a0ed-b9a8851a0558","Type":"ContainerStarted","Data":"e98baa3e6196ced2e4553290e6a36da53030cd34b28930e3cfed7defece582f9"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.941010 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.942083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" event={"ID":"e786fada-7145-4c03-a0bb-073321237c38","Type":"ContainerStarted","Data":"f2c1d5fd24a5a5a89f03fbb9d8ea1e30285ca0ade4add5d5c51e600be7e25804"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.942383 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.943688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" event={"ID":"37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575","Type":"ContainerStarted","Data":"cfe64f316a0a54cc50833b3bb6925a539e9c1365fcc084b28a0f1ee0f00969f1"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.944076 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.945152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" event={"ID":"8c6fa0c7-2f56-4498-87ee-7ae0f64f262e","Type":"ContainerStarted","Data":"ba8fece680128df092fe2167776ceb178b1c42efa9c847297a94f67c23df1863"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.945454 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.946541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" 
event={"ID":"a628f7da-d487-43c7-9965-5697505667fb","Type":"ContainerStarted","Data":"84733edf7dea17d7cc0a5f4e8f42a0f52d4d8105e3e410176677f815de2015e8"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.946870 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.947694 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" event={"ID":"f6c692fb-2aa5-47c4-8036-265ad9d63131","Type":"ContainerStarted","Data":"66978a4718098ee1b2294e85673d71357d56789621949fbac1102804b5822e7d"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.948018 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.949185 4735 generic.go:334] "Generic (PLEG): container finished" podID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerID="235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3" exitCode=0 Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.949225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerDied","Data":"235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.949240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerStarted","Data":"03108a9bff5ca86c76c3e1a6efd8d886f33a9d78660c24a61af3a594c03ec0a7"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.951094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" event={"ID":"4701b52d-3086-4792-bc35-c51cf4d63ad8","Type":"ContainerStarted","Data":"5fb05e74d37682cdd25a5e8a1c47195989782386fd88af5e050963a815ce4ebe"} Mar 17 01:26:27 crc kubenswrapper[4735]: I0317 01:26:27.951210 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.149323 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" podStartSLOduration=8.991395535 podStartE2EDuration="31.149303567s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.427263631 +0000 UTC m=+985.059496609" lastFinishedPulling="2026-03-17 01:26:21.585171653 +0000 UTC m=+1007.217404641" observedRunningTime="2026-03-17 01:26:28.145302729 +0000 UTC m=+1013.777535707" watchObservedRunningTime="2026-03-17 01:26:28.149303567 +0000 UTC m=+1013.781536545" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.330143 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" podStartSLOduration=4.342080103 podStartE2EDuration="31.330126848s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.495801923 +0000 UTC m=+985.128034901" lastFinishedPulling="2026-03-17 01:26:26.483848658 +0000 UTC m=+1012.116081646" observedRunningTime="2026-03-17 01:26:28.251731106 +0000 UTC m=+1013.883964084" watchObservedRunningTime="2026-03-17 01:26:28.330126848 +0000 UTC m=+1013.962359816" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.330617 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5brsc" podStartSLOduration=4.168323554 
podStartE2EDuration="31.330613711s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.482416026 +0000 UTC m=+985.114648994" lastFinishedPulling="2026-03-17 01:26:26.644706173 +0000 UTC m=+1012.276939151" observedRunningTime="2026-03-17 01:26:28.328355535 +0000 UTC m=+1013.960588513" watchObservedRunningTime="2026-03-17 01:26:28.330613711 +0000 UTC m=+1013.962846689" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.354574 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" podStartSLOduration=4.320826554 podStartE2EDuration="31.354559435s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.476523832 +0000 UTC m=+985.108756810" lastFinishedPulling="2026-03-17 01:26:26.510256713 +0000 UTC m=+1012.142489691" observedRunningTime="2026-03-17 01:26:28.350219959 +0000 UTC m=+1013.982452947" watchObservedRunningTime="2026-03-17 01:26:28.354559435 +0000 UTC m=+1013.986792403" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.444953 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" podStartSLOduration=4.448237193 podStartE2EDuration="31.444938379s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.488065714 +0000 UTC m=+985.120298692" lastFinishedPulling="2026-03-17 01:26:26.4847669 +0000 UTC m=+1012.116999878" observedRunningTime="2026-03-17 01:26:28.399218345 +0000 UTC m=+1014.031451323" watchObservedRunningTime="2026-03-17 01:26:28.444938379 +0000 UTC m=+1014.077171357" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.476639 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" podStartSLOduration=9.367791878 
podStartE2EDuration="31.476622963s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.476363228 +0000 UTC m=+985.108596206" lastFinishedPulling="2026-03-17 01:26:21.585194313 +0000 UTC m=+1007.217427291" observedRunningTime="2026-03-17 01:26:28.473196699 +0000 UTC m=+1014.105429667" watchObservedRunningTime="2026-03-17 01:26:28.476622963 +0000 UTC m=+1014.108855941" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.478700 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" podStartSLOduration=4.312524426 podStartE2EDuration="32.478694013s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:58.519228008 +0000 UTC m=+984.151460986" lastFinishedPulling="2026-03-17 01:26:26.685397595 +0000 UTC m=+1012.317630573" observedRunningTime="2026-03-17 01:26:28.447609424 +0000 UTC m=+1014.079842402" watchObservedRunningTime="2026-03-17 01:26:28.478694013 +0000 UTC m=+1014.110926991" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.520787 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561846-4jp52" podStartSLOduration=3.878873769 podStartE2EDuration="28.52077098s" podCreationTimestamp="2026-03-17 01:26:00 +0000 UTC" firstStartedPulling="2026-03-17 01:26:01.284877407 +0000 UTC m=+986.917110385" lastFinishedPulling="2026-03-17 01:26:25.926774618 +0000 UTC m=+1011.559007596" observedRunningTime="2026-03-17 01:26:28.518176537 +0000 UTC m=+1014.150409515" watchObservedRunningTime="2026-03-17 01:26:28.52077098 +0000 UTC m=+1014.153003958" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.521109 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" podStartSLOduration=10.261661788 podStartE2EDuration="32.521105178s" 
podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.326795259 +0000 UTC m=+984.959028227" lastFinishedPulling="2026-03-17 01:26:21.586238629 +0000 UTC m=+1007.218471617" observedRunningTime="2026-03-17 01:26:28.503529059 +0000 UTC m=+1014.135762037" watchObservedRunningTime="2026-03-17 01:26:28.521105178 +0000 UTC m=+1014.153338156" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.553143 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" podStartSLOduration=9.53945925 podStartE2EDuration="32.553128879s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:58.572374845 +0000 UTC m=+984.204607823" lastFinishedPulling="2026-03-17 01:26:21.586044474 +0000 UTC m=+1007.218277452" observedRunningTime="2026-03-17 01:26:28.544837527 +0000 UTC m=+1014.177070505" watchObservedRunningTime="2026-03-17 01:26:28.553128879 +0000 UTC m=+1014.185361857" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.588770 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" podStartSLOduration=10.709979075 podStartE2EDuration="32.588755788s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:58.399245851 +0000 UTC m=+984.031478819" lastFinishedPulling="2026-03-17 01:26:20.278022554 +0000 UTC m=+1005.910255532" observedRunningTime="2026-03-17 01:26:28.584183607 +0000 UTC m=+1014.216416585" watchObservedRunningTime="2026-03-17 01:26:28.588755788 +0000 UTC m=+1014.220988766" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.641818 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" podStartSLOduration=9.529630285 podStartE2EDuration="31.641804702s" 
podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.473661952 +0000 UTC m=+985.105894930" lastFinishedPulling="2026-03-17 01:26:21.585836369 +0000 UTC m=+1007.218069347" observedRunningTime="2026-03-17 01:26:28.640727556 +0000 UTC m=+1014.272960534" watchObservedRunningTime="2026-03-17 01:26:28.641804702 +0000 UTC m=+1014.274037680" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.756406 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" podStartSLOduration=9.369693729 podStartE2EDuration="32.756385178s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.427943997 +0000 UTC m=+985.060176975" lastFinishedPulling="2026-03-17 01:26:22.814635426 +0000 UTC m=+1008.446868424" observedRunningTime="2026-03-17 01:26:28.695739449 +0000 UTC m=+1014.327972427" watchObservedRunningTime="2026-03-17 01:26:28.756385178 +0000 UTC m=+1014.388618156" Mar 17 01:26:28 crc kubenswrapper[4735]: I0317 01:26:28.770540 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" podStartSLOduration=4.705534779 podStartE2EDuration="31.770526383s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.48711154 +0000 UTC m=+985.119344518" lastFinishedPulling="2026-03-17 01:26:26.552103144 +0000 UTC m=+1012.184336122" observedRunningTime="2026-03-17 01:26:28.735286893 +0000 UTC m=+1014.367519861" watchObservedRunningTime="2026-03-17 01:26:28.770526383 +0000 UTC m=+1014.402759361" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.043595 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: 
\"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.078647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc876da-b6d0-4c8b-ab2e-84558e5ba079-cert\") pod \"infra-operator-controller-manager-7b9c774f96-v78gc\" (UID: \"5fc876da-b6d0-4c8b-ab2e-84558e5ba079\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.098648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.408977 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fzxpc"] Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.410234 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.421450 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fzxpc"] Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.551175 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.551233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-utilities\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.551255 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-catalog-content\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.551327 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6xh\" (UniqueName: \"kubernetes.io/projected/99054d59-96b8-4888-9e8d-342e9425ab75-kube-api-access-ft6xh\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 
01:26:29.556085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77baae0d-2b5c-4b05-ab71-c87259ef645a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fwv89\" (UID: \"77baae0d-2b5c-4b05-ab71-c87259ef645a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.652155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6xh\" (UniqueName: \"kubernetes.io/projected/99054d59-96b8-4888-9e8d-342e9425ab75-kube-api-access-ft6xh\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.652347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-utilities\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.652431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-catalog-content\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.652838 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-utilities\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: 
I0317 01:26:29.652910 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-catalog-content\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.668767 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6xh\" (UniqueName: \"kubernetes.io/projected/99054d59-96b8-4888-9e8d-342e9425ab75-kube-api-access-ft6xh\") pod \"certified-operators-fzxpc\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.670588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.731603 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.968324 4735 generic.go:334] "Generic (PLEG): container finished" podID="890c33ed-65c3-4f76-9da9-99b3f3e8ef33" containerID="77fcb39e16c9768adfa0fa05624e0745133bc102fbdedae525200f364f4ba151" exitCode=0 Mar 17 01:26:29 crc kubenswrapper[4735]: I0317 01:26:29.968447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-4jp52" event={"ID":"890c33ed-65c3-4f76-9da9-99b3f3e8ef33","Type":"ContainerDied","Data":"77fcb39e16c9768adfa0fa05624e0745133bc102fbdedae525200f364f4ba151"} Mar 17 01:26:30 crc kubenswrapper[4735]: I0317 01:26:30.056580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:30 crc kubenswrapper[4735]: I0317 01:26:30.057950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:30 crc kubenswrapper[4735]: I0317 01:26:30.062385 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-metrics-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:30 
crc kubenswrapper[4735]: I0317 01:26:30.076244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9a63ed64-5e87-4f3d-8568-284237818e90-webhook-certs\") pod \"openstack-operator-controller-manager-576dc457f-z87tj\" (UID: \"9a63ed64-5e87-4f3d-8568-284237818e90\") " pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:30 crc kubenswrapper[4735]: I0317 01:26:30.305326 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.396712 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.497051 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6lkv\" (UniqueName: \"kubernetes.io/projected/890c33ed-65c3-4f76-9da9-99b3f3e8ef33-kube-api-access-b6lkv\") pod \"890c33ed-65c3-4f76-9da9-99b3f3e8ef33\" (UID: \"890c33ed-65c3-4f76-9da9-99b3f3e8ef33\") " Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.503599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890c33ed-65c3-4f76-9da9-99b3f3e8ef33-kube-api-access-b6lkv" (OuterVolumeSpecName: "kube-api-access-b6lkv") pod "890c33ed-65c3-4f76-9da9-99b3f3e8ef33" (UID: "890c33ed-65c3-4f76-9da9-99b3f3e8ef33"). InnerVolumeSpecName "kube-api-access-b6lkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.576226 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-gsft7"] Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.599176 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-gsft7"] Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.619651 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6lkv\" (UniqueName: \"kubernetes.io/projected/890c33ed-65c3-4f76-9da9-99b3f3e8ef33-kube-api-access-b6lkv\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.798543 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc"] Mar 17 01:26:31 crc kubenswrapper[4735]: W0317 01:26:31.825360 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc876da_b6d0_4c8b_ab2e_84558e5ba079.slice/crio-952bdfb8555fc483ff481e6f52d642b48fa7f93527ec3114d626562030e771e2 WatchSource:0}: Error finding container 952bdfb8555fc483ff481e6f52d642b48fa7f93527ec3114d626562030e771e2: Status 404 returned error can't find the container with id 952bdfb8555fc483ff481e6f52d642b48fa7f93527ec3114d626562030e771e2 Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.877659 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89"] Mar 17 01:26:31 crc kubenswrapper[4735]: W0317 01:26:31.886379 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77baae0d_2b5c_4b05_ab71_c87259ef645a.slice/crio-c4174f42cda3717327a2c60742ed5631fcefc3c83933f6caf5ff8d12f0b323c1 WatchSource:0}: Error finding container 
c4174f42cda3717327a2c60742ed5631fcefc3c83933f6caf5ff8d12f0b323c1: Status 404 returned error can't find the container with id c4174f42cda3717327a2c60742ed5631fcefc3c83933f6caf5ff8d12f0b323c1 Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.911660 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fzxpc"] Mar 17 01:26:31 crc kubenswrapper[4735]: W0317 01:26:31.923664 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99054d59_96b8_4888_9e8d_342e9425ab75.slice/crio-c64ca0d86c5736a26c2e4c1f453d94f93a0b3969f7211ec06d40b94e82ef2791 WatchSource:0}: Error finding container c64ca0d86c5736a26c2e4c1f453d94f93a0b3969f7211ec06d40b94e82ef2791: Status 404 returned error can't find the container with id c64ca0d86c5736a26c2e4c1f453d94f93a0b3969f7211ec06d40b94e82ef2791 Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.926017 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj"] Mar 17 01:26:31 crc kubenswrapper[4735]: W0317 01:26:31.938232 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a63ed64_5e87_4f3d_8568_284237818e90.slice/crio-1dc0cd3d4140f11170a7955af3c4b4125a06af1a27d81e93132ddd0250b3db67 WatchSource:0}: Error finding container 1dc0cd3d4140f11170a7955af3c4b4125a06af1a27d81e93132ddd0250b3db67: Status 404 returned error can't find the container with id 1dc0cd3d4140f11170a7955af3c4b4125a06af1a27d81e93132ddd0250b3db67 Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.989940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" event={"ID":"5fc876da-b6d0-4c8b-ab2e-84558e5ba079","Type":"ContainerStarted","Data":"952bdfb8555fc483ff481e6f52d642b48fa7f93527ec3114d626562030e771e2"} Mar 17 
01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.991342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerStarted","Data":"e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4"} Mar 17 01:26:31 crc kubenswrapper[4735]: I0317 01:26:31.994078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" event={"ID":"77baae0d-2b5c-4b05-ab71-c87259ef645a","Type":"ContainerStarted","Data":"c4174f42cda3717327a2c60742ed5631fcefc3c83933f6caf5ff8d12f0b323c1"} Mar 17 01:26:32 crc kubenswrapper[4735]: I0317 01:26:31.995607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerStarted","Data":"c64ca0d86c5736a26c2e4c1f453d94f93a0b3969f7211ec06d40b94e82ef2791"} Mar 17 01:26:32 crc kubenswrapper[4735]: I0317 01:26:31.997010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-4jp52" event={"ID":"890c33ed-65c3-4f76-9da9-99b3f3e8ef33","Type":"ContainerDied","Data":"afd9eaf8ae470ee8fc0a24a82880912bc2f78ffca2f968b8a783524c85218d39"} Mar 17 01:26:32 crc kubenswrapper[4735]: I0317 01:26:31.997050 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd9eaf8ae470ee8fc0a24a82880912bc2f78ffca2f968b8a783524c85218d39" Mar 17 01:26:32 crc kubenswrapper[4735]: I0317 01:26:31.997071 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-4jp52" Mar 17 01:26:32 crc kubenswrapper[4735]: I0317 01:26:31.998595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" event={"ID":"9a63ed64-5e87-4f3d-8568-284237818e90","Type":"ContainerStarted","Data":"1dc0cd3d4140f11170a7955af3c4b4125a06af1a27d81e93132ddd0250b3db67"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.015424 4735 generic.go:334] "Generic (PLEG): container finished" podID="99054d59-96b8-4888-9e8d-342e9425ab75" containerID="af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407" exitCode=0 Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.015527 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerDied","Data":"af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.020766 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" event={"ID":"eefbeadd-f37e-4419-8d66-ba1731016fd0","Type":"ContainerStarted","Data":"8d74a3806867f91399db55d10282cefa48dd3f929b0a97d5deafa76f35147651"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.021505 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.031120 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" event={"ID":"4b767f40-0d53-4067-a546-0f14da7659bc","Type":"ContainerStarted","Data":"953ea41ab030fc4d714e63ae95c9502bb1aa541f1c6cf1c8506d67de4f1696a9"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.031365 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.049163 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" event={"ID":"9c858fbe-58b8-4dea-91e5-05366d1bd648","Type":"ContainerStarted","Data":"a15dec13d4e79e0be12d394c7a08754471dbefdba6a3854055615088fb8e7e26"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.049741 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.054598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" event={"ID":"9a63ed64-5e87-4f3d-8568-284237818e90","Type":"ContainerStarted","Data":"aec62aa31c3394765dbbecc29e230c1bf197e6d039cc3ffce7c1b1d75e87be95"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.054776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.071401 4735 generic.go:334] "Generic (PLEG): container finished" podID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerID="e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4" exitCode=0 Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.071472 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerDied","Data":"e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4"} Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.084127 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" podStartSLOduration=2.996818475 podStartE2EDuration="36.084113253s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.448191711 +0000 UTC m=+985.080424689" lastFinishedPulling="2026-03-17 01:26:32.535486479 +0000 UTC m=+1018.167719467" observedRunningTime="2026-03-17 01:26:33.082326358 +0000 UTC m=+1018.714559326" watchObservedRunningTime="2026-03-17 01:26:33.084113253 +0000 UTC m=+1018.716346231" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.098178 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b61c1fb-1ab0-4aac-9086-c665a8521904" path="/var/lib/kubelet/pods/7b61c1fb-1ab0-4aac-9086-c665a8521904/volumes" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.105599 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" podStartSLOduration=3.156297649 podStartE2EDuration="37.105580086s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:58.841466609 +0000 UTC m=+984.473699587" lastFinishedPulling="2026-03-17 01:26:32.790749046 +0000 UTC m=+1018.422982024" observedRunningTime="2026-03-17 01:26:33.101055836 +0000 UTC m=+1018.733288814" watchObservedRunningTime="2026-03-17 01:26:33.105580086 +0000 UTC m=+1018.737813064" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.191692 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" podStartSLOduration=36.191677197 podStartE2EDuration="36.191677197s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:26:33.19017674 +0000 UTC m=+1018.822409718" watchObservedRunningTime="2026-03-17 
01:26:33.191677197 +0000 UTC m=+1018.823910175" Mar 17 01:26:33 crc kubenswrapper[4735]: I0317 01:26:33.211496 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" podStartSLOduration=3.14418574 podStartE2EDuration="36.21148114s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.465251817 +0000 UTC m=+985.097484795" lastFinishedPulling="2026-03-17 01:26:32.532547187 +0000 UTC m=+1018.164780195" observedRunningTime="2026-03-17 01:26:33.209404339 +0000 UTC m=+1018.841637317" watchObservedRunningTime="2026-03-17 01:26:33.21148114 +0000 UTC m=+1018.843714108" Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.087881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerStarted","Data":"d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2"} Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.103326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" event={"ID":"b596dfa6-ef2c-4c3c-80fb-f18229f7b99f","Type":"ContainerStarted","Data":"696be8a28303992c502bfc42eb361d71dd64bbe5dc17465444956131a8686883"} Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.103641 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.106403 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" event={"ID":"da31620d-afd3-4129-a12b-bfddaead4abd","Type":"ContainerStarted","Data":"cb5d131bfc1a2fdf247687d5bfd674b3d30ac6220dc3126e59d7437f0d6c6d63"} Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 
01:26:34.106704 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.124166 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerStarted","Data":"121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72"} Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.165771 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" podStartSLOduration=3.337444538 podStartE2EDuration="38.165753399s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:25:58.84151503 +0000 UTC m=+984.473748008" lastFinishedPulling="2026-03-17 01:26:33.669823891 +0000 UTC m=+1019.302056869" observedRunningTime="2026-03-17 01:26:34.158173315 +0000 UTC m=+1019.790406293" watchObservedRunningTime="2026-03-17 01:26:34.165753399 +0000 UTC m=+1019.797986377" Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.201282 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" podStartSLOduration=2.915305617 podStartE2EDuration="37.201264496s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.378363948 +0000 UTC m=+985.010596926" lastFinishedPulling="2026-03-17 01:26:33.664322807 +0000 UTC m=+1019.296555805" observedRunningTime="2026-03-17 01:26:34.187035748 +0000 UTC m=+1019.819268726" watchObservedRunningTime="2026-03-17 01:26:34.201264496 +0000 UTC m=+1019.833497474" Mar 17 01:26:34 crc kubenswrapper[4735]: I0317 01:26:34.205560 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qdj6f" 
podStartSLOduration=5.48796976 podStartE2EDuration="11.205549051s" podCreationTimestamp="2026-03-17 01:26:23 +0000 UTC" firstStartedPulling="2026-03-17 01:26:27.950227071 +0000 UTC m=+1013.582460049" lastFinishedPulling="2026-03-17 01:26:33.667806362 +0000 UTC m=+1019.300039340" observedRunningTime="2026-03-17 01:26:34.201995603 +0000 UTC m=+1019.834228581" watchObservedRunningTime="2026-03-17 01:26:34.205549051 +0000 UTC m=+1019.837782029" Mar 17 01:26:35 crc kubenswrapper[4735]: I0317 01:26:35.153461 4735 generic.go:334] "Generic (PLEG): container finished" podID="99054d59-96b8-4888-9e8d-342e9425ab75" containerID="d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2" exitCode=0 Mar 17 01:26:35 crc kubenswrapper[4735]: I0317 01:26:35.153562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerDied","Data":"d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2"} Mar 17 01:26:36 crc kubenswrapper[4735]: I0317 01:26:36.640622 4735 scope.go:117] "RemoveContainer" containerID="9016f45133340a0c4eea3a5c2ef494fa622ee1ba2e7d882d048ebcd6930a55b8" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.088242 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wtjk5" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.105433 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dsd6v" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.168449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" event={"ID":"77baae0d-2b5c-4b05-ab71-c87259ef645a","Type":"ContainerStarted","Data":"1f2f0c10d3f2d1ef9533f9d8e6e6f895aadfcb77e902bfe1e10a12126c41fa7d"} Mar 17 
01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.169414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.170772 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" event={"ID":"5fc876da-b6d0-4c8b-ab2e-84558e5ba079","Type":"ContainerStarted","Data":"3acac5430c77344ced32e6471e0a2b0659f57ba8c84c6f961a2bc690d54baff7"} Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.171029 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.178767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" event={"ID":"6306f8c1-066b-46e3-b76c-5490680c0ae3","Type":"ContainerStarted","Data":"c24c7b49378b34e5b7f82eb72a3ab6a396568f3af0e0e2ea36d6e96c2cda14c7"} Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.179354 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.181662 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mrkzz" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.182123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerStarted","Data":"48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605"} Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.184934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" event={"ID":"486810ee-5bdf-451a-bc69-179723bbe75d","Type":"ContainerStarted","Data":"a75f08c75b181c6a2b3c0d0e494b5615e5de35a727108572be0fdaa8e0693efd"} Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.185295 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.214350 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" podStartSLOduration=35.874384717 podStartE2EDuration="40.214336191s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:26:31.891694384 +0000 UTC m=+1017.523927352" lastFinishedPulling="2026-03-17 01:26:36.231645848 +0000 UTC m=+1021.863878826" observedRunningTime="2026-03-17 01:26:37.207379302 +0000 UTC m=+1022.839612280" watchObservedRunningTime="2026-03-17 01:26:37.214336191 +0000 UTC m=+1022.846569169" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.239618 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" podStartSLOduration=36.802742011 podStartE2EDuration="41.239602938s" podCreationTimestamp="2026-03-17 01:25:56 +0000 UTC" firstStartedPulling="2026-03-17 01:26:31.828051923 +0000 UTC m=+1017.460284901" lastFinishedPulling="2026-03-17 01:26:36.26491285 +0000 UTC m=+1021.897145828" observedRunningTime="2026-03-17 01:26:37.239127446 +0000 UTC m=+1022.871360424" watchObservedRunningTime="2026-03-17 01:26:37.239602938 +0000 UTC m=+1022.871835916" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.261376 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2gnd8" Mar 17 01:26:37 
crc kubenswrapper[4735]: I0317 01:26:37.298487 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-r4z8g" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.321923 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" podStartSLOduration=3.489681818 podStartE2EDuration="40.321907835s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.432563049 +0000 UTC m=+985.064796027" lastFinishedPulling="2026-03-17 01:26:36.264789066 +0000 UTC m=+1021.897022044" observedRunningTime="2026-03-17 01:26:37.319232601 +0000 UTC m=+1022.951465579" watchObservedRunningTime="2026-03-17 01:26:37.321907835 +0000 UTC m=+1022.954140813" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.329419 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-t6nkw" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.339642 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" podStartSLOduration=3.498817342 podStartE2EDuration="40.339625898s" podCreationTimestamp="2026-03-17 01:25:57 +0000 UTC" firstStartedPulling="2026-03-17 01:25:59.427623109 +0000 UTC m=+985.059856087" lastFinishedPulling="2026-03-17 01:26:36.268431665 +0000 UTC m=+1021.900664643" observedRunningTime="2026-03-17 01:26:37.33729042 +0000 UTC m=+1022.969523398" watchObservedRunningTime="2026-03-17 01:26:37.339625898 +0000 UTC m=+1022.971858876" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.387188 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fzxpc" podStartSLOduration=5.145612708 podStartE2EDuration="8.387173157s" 
podCreationTimestamp="2026-03-17 01:26:29 +0000 UTC" firstStartedPulling="2026-03-17 01:26:33.023125784 +0000 UTC m=+1018.655358762" lastFinishedPulling="2026-03-17 01:26:36.264686233 +0000 UTC m=+1021.896919211" observedRunningTime="2026-03-17 01:26:37.369692592 +0000 UTC m=+1023.001925570" watchObservedRunningTime="2026-03-17 01:26:37.387173157 +0000 UTC m=+1023.019406135" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.514585 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7cjd9" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.543834 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-gwwjh" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.716131 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kklq4" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.932202 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t8wv8" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.963344 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2knm" Mar 17 01:26:37 crc kubenswrapper[4735]: I0317 01:26:37.997003 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-257t8" Mar 17 01:26:38 crc kubenswrapper[4735]: I0317 01:26:38.045099 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wnncr" Mar 17 01:26:38 crc kubenswrapper[4735]: I0317 01:26:38.232051 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p9zdz" Mar 17 01:26:38 crc kubenswrapper[4735]: I0317 01:26:38.266063 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-txd6q" Mar 17 01:26:39 crc kubenswrapper[4735]: I0317 01:26:39.732120 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:39 crc kubenswrapper[4735]: I0317 01:26:39.732420 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:40 crc kubenswrapper[4735]: I0317 01:26:40.316301 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-576dc457f-z87tj" Mar 17 01:26:40 crc kubenswrapper[4735]: I0317 01:26:40.777465 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fzxpc" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="registry-server" probeResult="failure" output=< Mar 17 01:26:40 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:26:40 crc kubenswrapper[4735]: > Mar 17 01:26:43 crc kubenswrapper[4735]: I0317 01:26:43.734770 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:43 crc kubenswrapper[4735]: I0317 01:26:43.735188 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:44 crc kubenswrapper[4735]: I0317 01:26:44.775744 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-qdj6f" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="registry-server" probeResult="failure" output=< 
Mar 17 01:26:44 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:26:44 crc kubenswrapper[4735]: > Mar 17 01:26:47 crc kubenswrapper[4735]: I0317 01:26:47.149118 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2tj58" Mar 17 01:26:47 crc kubenswrapper[4735]: I0317 01:26:47.376367 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vm55l" Mar 17 01:26:47 crc kubenswrapper[4735]: I0317 01:26:47.591137 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4dmmd" Mar 17 01:26:47 crc kubenswrapper[4735]: I0317 01:26:47.697908 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-lhsm8" Mar 17 01:26:49 crc kubenswrapper[4735]: I0317 01:26:49.104846 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-v78gc" Mar 17 01:26:49 crc kubenswrapper[4735]: I0317 01:26:49.678194 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fwv89" Mar 17 01:26:49 crc kubenswrapper[4735]: I0317 01:26:49.780888 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:49 crc kubenswrapper[4735]: I0317 01:26:49.836292 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:50 crc kubenswrapper[4735]: I0317 01:26:50.009649 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fzxpc"] Mar 
17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.285501 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fzxpc" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="registry-server" containerID="cri-o://48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605" gracePeriod=2 Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.700176 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.759967 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft6xh\" (UniqueName: \"kubernetes.io/projected/99054d59-96b8-4888-9e8d-342e9425ab75-kube-api-access-ft6xh\") pod \"99054d59-96b8-4888-9e8d-342e9425ab75\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.760031 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-utilities\") pod \"99054d59-96b8-4888-9e8d-342e9425ab75\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.760072 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-catalog-content\") pod \"99054d59-96b8-4888-9e8d-342e9425ab75\" (UID: \"99054d59-96b8-4888-9e8d-342e9425ab75\") " Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.760890 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-utilities" (OuterVolumeSpecName: "utilities") pod "99054d59-96b8-4888-9e8d-342e9425ab75" (UID: "99054d59-96b8-4888-9e8d-342e9425ab75"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.765770 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99054d59-96b8-4888-9e8d-342e9425ab75-kube-api-access-ft6xh" (OuterVolumeSpecName: "kube-api-access-ft6xh") pod "99054d59-96b8-4888-9e8d-342e9425ab75" (UID: "99054d59-96b8-4888-9e8d-342e9425ab75"). InnerVolumeSpecName "kube-api-access-ft6xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.807727 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99054d59-96b8-4888-9e8d-342e9425ab75" (UID: "99054d59-96b8-4888-9e8d-342e9425ab75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.862044 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft6xh\" (UniqueName: \"kubernetes.io/projected/99054d59-96b8-4888-9e8d-342e9425ab75-kube-api-access-ft6xh\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.862078 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:51 crc kubenswrapper[4735]: I0317 01:26:51.862089 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99054d59-96b8-4888-9e8d-342e9425ab75-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.296190 4735 generic.go:334] "Generic (PLEG): container finished" podID="99054d59-96b8-4888-9e8d-342e9425ab75" 
containerID="48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605" exitCode=0 Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.296247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerDied","Data":"48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605"} Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.297097 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzxpc" event={"ID":"99054d59-96b8-4888-9e8d-342e9425ab75","Type":"ContainerDied","Data":"c64ca0d86c5736a26c2e4c1f453d94f93a0b3969f7211ec06d40b94e82ef2791"} Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.297201 4735 scope.go:117] "RemoveContainer" containerID="48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.296285 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzxpc" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.323044 4735 scope.go:117] "RemoveContainer" containerID="d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.331049 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fzxpc"] Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.349004 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fzxpc"] Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.369344 4735 scope.go:117] "RemoveContainer" containerID="af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.392642 4735 scope.go:117] "RemoveContainer" containerID="48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605" Mar 17 01:26:52 crc kubenswrapper[4735]: E0317 01:26:52.393170 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605\": container with ID starting with 48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605 not found: ID does not exist" containerID="48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.393207 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605"} err="failed to get container status \"48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605\": rpc error: code = NotFound desc = could not find container \"48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605\": container with ID starting with 48dc3200157bb3c483a06a3c6a4c2cba6741fbea4e68777dc98809b19fc24605 not 
found: ID does not exist" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.393231 4735 scope.go:117] "RemoveContainer" containerID="d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2" Mar 17 01:26:52 crc kubenswrapper[4735]: E0317 01:26:52.393448 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2\": container with ID starting with d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2 not found: ID does not exist" containerID="d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.393475 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2"} err="failed to get container status \"d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2\": rpc error: code = NotFound desc = could not find container \"d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2\": container with ID starting with d51d796204224ef742d9da5131e11785a3205c6be749a218aca16ef7584d45f2 not found: ID does not exist" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.393491 4735 scope.go:117] "RemoveContainer" containerID="af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407" Mar 17 01:26:52 crc kubenswrapper[4735]: E0317 01:26:52.393721 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407\": container with ID starting with af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407 not found: ID does not exist" containerID="af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407" Mar 17 01:26:52 crc kubenswrapper[4735]: I0317 01:26:52.393756 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407"} err="failed to get container status \"af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407\": rpc error: code = NotFound desc = could not find container \"af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407\": container with ID starting with af9cae8c98706ecae693f3a662fad608cc301ba28291d02729453d3ef2ea7407 not found: ID does not exist" Mar 17 01:26:53 crc kubenswrapper[4735]: I0317 01:26:53.084738 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" path="/var/lib/kubelet/pods/99054d59-96b8-4888-9e8d-342e9425ab75/volumes" Mar 17 01:26:53 crc kubenswrapper[4735]: I0317 01:26:53.810988 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:53 crc kubenswrapper[4735]: I0317 01:26:53.871272 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.408221 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdj6f"] Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.408593 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qdj6f" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="registry-server" containerID="cri-o://121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72" gracePeriod=2 Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.793001 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.826439 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-catalog-content\") pod \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.826478 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-utilities\") pod \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.826578 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhcth\" (UniqueName: \"kubernetes.io/projected/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-kube-api-access-jhcth\") pod \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\" (UID: \"feb4d4e5-39d8-42c5-83b7-d73cfba6095d\") " Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.827539 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-utilities" (OuterVolumeSpecName: "utilities") pod "feb4d4e5-39d8-42c5-83b7-d73cfba6095d" (UID: "feb4d4e5-39d8-42c5-83b7-d73cfba6095d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.836291 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-kube-api-access-jhcth" (OuterVolumeSpecName: "kube-api-access-jhcth") pod "feb4d4e5-39d8-42c5-83b7-d73cfba6095d" (UID: "feb4d4e5-39d8-42c5-83b7-d73cfba6095d"). InnerVolumeSpecName "kube-api-access-jhcth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.852800 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feb4d4e5-39d8-42c5-83b7-d73cfba6095d" (UID: "feb4d4e5-39d8-42c5-83b7-d73cfba6095d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.928684 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.928729 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:55 crc kubenswrapper[4735]: I0317 01:26:55.928749 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhcth\" (UniqueName: \"kubernetes.io/projected/feb4d4e5-39d8-42c5-83b7-d73cfba6095d-kube-api-access-jhcth\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.331063 4735 generic.go:334] "Generic (PLEG): container finished" podID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerID="121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72" exitCode=0 Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.331105 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerDied","Data":"121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72"} Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.331131 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qdj6f" event={"ID":"feb4d4e5-39d8-42c5-83b7-d73cfba6095d","Type":"ContainerDied","Data":"03108a9bff5ca86c76c3e1a6efd8d886f33a9d78660c24a61af3a594c03ec0a7"} Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.331147 4735 scope.go:117] "RemoveContainer" containerID="121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.331296 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdj6f" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.388996 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdj6f"] Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.402808 4735 scope.go:117] "RemoveContainer" containerID="e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.404103 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdj6f"] Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.436590 4735 scope.go:117] "RemoveContainer" containerID="235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.461196 4735 scope.go:117] "RemoveContainer" containerID="121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72" Mar 17 01:26:56 crc kubenswrapper[4735]: E0317 01:26:56.461664 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72\": container with ID starting with 121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72 not found: ID does not exist" containerID="121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.461708 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72"} err="failed to get container status \"121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72\": rpc error: code = NotFound desc = could not find container \"121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72\": container with ID starting with 121d919673b6c6f2f781050112c413c43ba23fe975e5695c76c4b3746ba0bf72 not found: ID does not exist" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.461734 4735 scope.go:117] "RemoveContainer" containerID="e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4" Mar 17 01:26:56 crc kubenswrapper[4735]: E0317 01:26:56.462117 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4\": container with ID starting with e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4 not found: ID does not exist" containerID="e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.462148 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4"} err="failed to get container status \"e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4\": rpc error: code = NotFound desc = could not find container \"e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4\": container with ID starting with e7ea488f4cc2e45786b03efc1b2e488f63601560e261eca52f72727bd73e87f4 not found: ID does not exist" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.462169 4735 scope.go:117] "RemoveContainer" containerID="235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3" Mar 17 01:26:56 crc kubenswrapper[4735]: E0317 
01:26:56.462435 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3\": container with ID starting with 235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3 not found: ID does not exist" containerID="235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3" Mar 17 01:26:56 crc kubenswrapper[4735]: I0317 01:26:56.462555 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3"} err="failed to get container status \"235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3\": rpc error: code = NotFound desc = could not find container \"235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3\": container with ID starting with 235425bf37f471236c10b67cbca9fb60fc79093b3c2d43ce6b6a875bece266a3 not found: ID does not exist" Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.081655 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" path="/var/lib/kubelet/pods/feb4d4e5-39d8-42c5-83b7-d73cfba6095d/volumes" Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.820876 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7wt25"] Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821126 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="registry-server" Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821138 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="registry-server" Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821147 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" 
containerName="extract-content"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821153 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="extract-content"
Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821159 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890c33ed-65c3-4f76-9da9-99b3f3e8ef33" containerName="oc"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821165 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="890c33ed-65c3-4f76-9da9-99b3f3e8ef33" containerName="oc"
Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821178 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="extract-utilities"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821183 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="extract-utilities"
Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821197 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="registry-server"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821203 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="registry-server"
Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821211 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="extract-utilities"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821217 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="extract-utilities"
Mar 17 01:26:57 crc kubenswrapper[4735]: E0317 01:26:57.821226 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="extract-content"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821232 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="extract-content"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821358 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="890c33ed-65c3-4f76-9da9-99b3f3e8ef33" containerName="oc"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821370 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="99054d59-96b8-4888-9e8d-342e9425ab75" containerName="registry-server"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.821388 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb4d4e5-39d8-42c5-83b7-d73cfba6095d" containerName="registry-server"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.822280 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.841924 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7wt25"]
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.858671 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-catalog-content\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.858746 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-utilities\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.858794 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h8rs\" (UniqueName: \"kubernetes.io/projected/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-kube-api-access-9h8rs\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.960097 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-catalog-content\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.960737 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-catalog-content\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.960896 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-utilities\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.961191 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-utilities\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.961253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h8rs\" (UniqueName: \"kubernetes.io/projected/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-kube-api-access-9h8rs\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:57 crc kubenswrapper[4735]: I0317 01:26:57.980793 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h8rs\" (UniqueName: \"kubernetes.io/projected/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-kube-api-access-9h8rs\") pod \"redhat-operators-7wt25\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:58 crc kubenswrapper[4735]: I0317 01:26:58.140261 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:26:58 crc kubenswrapper[4735]: I0317 01:26:58.621983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7wt25"]
Mar 17 01:26:59 crc kubenswrapper[4735]: I0317 01:26:59.352563 4735 generic.go:334] "Generic (PLEG): container finished" podID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerID="a7779779a85721143bf17539145a858de9870c35c9550e7b15054de532348819" exitCode=0
Mar 17 01:26:59 crc kubenswrapper[4735]: I0317 01:26:59.352708 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wt25" event={"ID":"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9","Type":"ContainerDied","Data":"a7779779a85721143bf17539145a858de9870c35c9550e7b15054de532348819"}
Mar 17 01:26:59 crc kubenswrapper[4735]: I0317 01:26:59.354349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wt25" event={"ID":"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9","Type":"ContainerStarted","Data":"b5f69490b43f1463dab7dd02a1177a624bf9042265d00756c03a95c524843b74"}
Mar 17 01:27:01 crc kubenswrapper[4735]: I0317 01:27:01.368661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wt25" event={"ID":"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9","Type":"ContainerDied","Data":"d2b3bc96f86df9db28e5a2c9f64c96106e5f505dd0edb7effe9f23cc6d083a67"}
Mar 17 01:27:01 crc kubenswrapper[4735]: I0317 01:27:01.369298 4735 generic.go:334] "Generic (PLEG): container finished" podID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerID="d2b3bc96f86df9db28e5a2c9f64c96106e5f505dd0edb7effe9f23cc6d083a67" exitCode=0
Mar 17 01:27:03 crc kubenswrapper[4735]: I0317 01:27:03.390410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wt25" event={"ID":"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9","Type":"ContainerStarted","Data":"6de41bc3c43351f45bb5f7fe34d56c08d327e112939d7e0765de7f778abcefe4"}
Mar 17 01:27:03 crc kubenswrapper[4735]: I0317 01:27:03.411945 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7wt25" podStartSLOduration=3.176191696 podStartE2EDuration="6.411921023s" podCreationTimestamp="2026-03-17 01:26:57 +0000 UTC" firstStartedPulling="2026-03-17 01:26:59.353980818 +0000 UTC m=+1044.986213796" lastFinishedPulling="2026-03-17 01:27:02.589710145 +0000 UTC m=+1048.221943123" observedRunningTime="2026-03-17 01:27:03.410204322 +0000 UTC m=+1049.042437300" watchObservedRunningTime="2026-03-17 01:27:03.411921023 +0000 UTC m=+1049.044154041"
Mar 17 01:27:08 crc kubenswrapper[4735]: I0317 01:27:08.140641 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:27:08 crc kubenswrapper[4735]: I0317 01:27:08.140960 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7wt25"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.090599 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b95c5c449-6m592"]
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.092017 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.093826 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.093978 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-v54m8"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.093926 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.094342 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.103775 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b95c5c449-6m592"]
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.166520 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bd9cf7445-nzl4x"]
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.168483 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.174261 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.179499 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd9cf7445-nzl4x"]
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.229669 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44f7503-90aa-4ac1-b470-6b122a53d291-config\") pod \"dnsmasq-dns-7b95c5c449-6m592\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.229787 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nml7c\" (UniqueName: \"kubernetes.io/projected/f44f7503-90aa-4ac1-b470-6b122a53d291-kube-api-access-nml7c\") pod \"dnsmasq-dns-7b95c5c449-6m592\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.229804 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7wt25" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="registry-server" probeResult="failure" output=<
Mar 17 01:27:09 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 01:27:09 crc kubenswrapper[4735]: >
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.331202 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nml7c\" (UniqueName: \"kubernetes.io/projected/f44f7503-90aa-4ac1-b470-6b122a53d291-kube-api-access-nml7c\") pod \"dnsmasq-dns-7b95c5c449-6m592\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.331244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-dns-svc\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.331290 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-config\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.331432 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jghg\" (UniqueName: \"kubernetes.io/projected/cb661cb5-2847-480f-9703-941bab7097d3-kube-api-access-7jghg\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.331553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44f7503-90aa-4ac1-b470-6b122a53d291-config\") pod \"dnsmasq-dns-7b95c5c449-6m592\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.332310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44f7503-90aa-4ac1-b470-6b122a53d291-config\") pod \"dnsmasq-dns-7b95c5c449-6m592\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.348527 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nml7c\" (UniqueName: \"kubernetes.io/projected/f44f7503-90aa-4ac1-b470-6b122a53d291-kube-api-access-nml7c\") pod \"dnsmasq-dns-7b95c5c449-6m592\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.420403 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b95c5c449-6m592"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.433267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-dns-svc\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.433327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-config\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.433355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jghg\" (UniqueName: \"kubernetes.io/projected/cb661cb5-2847-480f-9703-941bab7097d3-kube-api-access-7jghg\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.434223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-dns-svc\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.434270 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-config\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.456587 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jghg\" (UniqueName: \"kubernetes.io/projected/cb661cb5-2847-480f-9703-941bab7097d3-kube-api-access-7jghg\") pod \"dnsmasq-dns-bd9cf7445-nzl4x\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.527945 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x"
Mar 17 01:27:09 crc kubenswrapper[4735]: I0317 01:27:09.898274 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b95c5c449-6m592"]
Mar 17 01:27:10 crc kubenswrapper[4735]: I0317 01:27:10.046108 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd9cf7445-nzl4x"]
Mar 17 01:27:10 crc kubenswrapper[4735]: W0317 01:27:10.051130 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb661cb5_2847_480f_9703_941bab7097d3.slice/crio-1fcc600532bca908d9b20b0081ea11e4d8abd0b6b5064324b623c05b122d996a WatchSource:0}: Error finding container 1fcc600532bca908d9b20b0081ea11e4d8abd0b6b5064324b623c05b122d996a: Status 404 returned error can't find the container with id 1fcc600532bca908d9b20b0081ea11e4d8abd0b6b5064324b623c05b122d996a
Mar 17 01:27:10 crc kubenswrapper[4735]: I0317 01:27:10.437326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b95c5c449-6m592" event={"ID":"f44f7503-90aa-4ac1-b470-6b122a53d291","Type":"ContainerStarted","Data":"1a910f4d751fe236422e52e99bda22cc84887860502fd2283f2fbbc42c6228cd"}
Mar 17 01:27:10 crc kubenswrapper[4735]: I0317 01:27:10.439500 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x" event={"ID":"cb661cb5-2847-480f-9703-941bab7097d3","Type":"ContainerStarted","Data":"1fcc600532bca908d9b20b0081ea11e4d8abd0b6b5064324b623c05b122d996a"}
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.332655 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b95c5c449-6m592"]
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.365910 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dcf85566c-m8vnj"]
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.369961 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.383546 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dcf85566c-m8vnj"]
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.493646 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-dns-svc\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.493704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cdhk\" (UniqueName: \"kubernetes.io/projected/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-kube-api-access-8cdhk\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.493766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-config\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.595067 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cdhk\" (UniqueName: \"kubernetes.io/projected/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-kube-api-access-8cdhk\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.595107 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-config\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.595245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-dns-svc\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.596630 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-dns-svc\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.601283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-config\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.648381 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cdhk\" (UniqueName: \"kubernetes.io/projected/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-kube-api-access-8cdhk\") pod \"dnsmasq-dns-dcf85566c-m8vnj\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:12 crc kubenswrapper[4735]: I0317 01:27:12.703828 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.110936 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dcf85566c-m8vnj"]
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.293909 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd9cf7445-nzl4x"]
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.335163 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86545856d7-wwdxv"]
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.336244 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.351297 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86545856d7-wwdxv"]
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.481506 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" event={"ID":"f0ff000b-fe0a-4dfc-b976-92c57cf5595a","Type":"ContainerStarted","Data":"52c552fb04d515ae54ae648570a82c16f1c7afe8d60adbfc3e0f1cb65ff05c1d"}
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.507410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-config\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.507475 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptttp\" (UniqueName: \"kubernetes.io/projected/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-kube-api-access-ptttp\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.507514 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-dns-svc\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.608205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-dns-svc\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.608302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-config\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.608335 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptttp\" (UniqueName: \"kubernetes.io/projected/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-kube-api-access-ptttp\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.610012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-config\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.610268 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-dns-svc\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.628010 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptttp\" (UniqueName: \"kubernetes.io/projected/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-kube-api-access-ptttp\") pod \"dnsmasq-dns-86545856d7-wwdxv\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.666810 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86545856d7-wwdxv"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.861136 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.862431 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870110 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870287 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lng5c"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870460 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870591 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870692 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870793 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.870906 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 17 01:27:13 crc kubenswrapper[4735]: I0317 01:27:13.879664 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.033543 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.033791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.033876 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwrf\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-kube-api-access-kmwrf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034181 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034214 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034239 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.034263 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135272 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135427 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-erlang-cookie-secret\") pod
\"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.135486 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwrf\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-kube-api-access-kmwrf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.137361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.137452 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.137608 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.137739 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.138555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.152701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.161629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.162219 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.162278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwrf\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-kube-api-access-kmwrf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.163027 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.168699 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.215415 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.220316 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.347021 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86545856d7-wwdxv"] Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.439660 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.441090 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444239 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5slxq" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444285 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444415 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444500 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444599 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444634 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444727 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.444754 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.530165 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" event={"ID":"d05a6355-3f32-4625-a6d4-9c0dbec18cb0","Type":"ContainerStarted","Data":"39c46b2e6a167aee8ed2b26e317a7b138063beb15176f740ceecc527ad884ba4"} Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.542926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.542971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6dn\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-kube-api-access-hq6dn\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543060 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543075 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " 
pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543127 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543158 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.543177 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc 
kubenswrapper[4735]: I0317 01:27:14.647063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6dn\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-kube-api-access-hq6dn\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647466 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647655 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " 
pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.647903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.649220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.649393 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.649493 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.649823 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.651123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.661092 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.663413 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.663477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.663947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.664038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq6dn\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-kube-api-access-hq6dn\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " 
pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.710173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.778758 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.829956 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:27:14 crc kubenswrapper[4735]: W0317 01:27:14.900939 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4bda0b8_ed44_4576_a404_b25cc7f8ea6e.slice/crio-cdd76d1b426c3fbc6f598e8cb35649494b662f5888648ef503cc97d391974fd8 WatchSource:0}: Error finding container cdd76d1b426c3fbc6f598e8cb35649494b662f5888648ef503cc97d391974fd8: Status 404 returned error can't find the container with id cdd76d1b426c3fbc6f598e8cb35649494b662f5888648ef503cc97d391974fd8 Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.952982 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.954236 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.961965 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.962127 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cm5ng" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.963274 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.963524 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.965740 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 17 01:27:14 crc kubenswrapper[4735]: I0317 01:27:14.966995 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.064576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.064852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.064894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/479dabcd-a158-4f12-9baf-77981001d3b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.064923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.064953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.065307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479dabcd-a158-4f12-9baf-77981001d3b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.065343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prmdd\" (UniqueName: \"kubernetes.io/projected/479dabcd-a158-4f12-9baf-77981001d3b3-kube-api-access-prmdd\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.065359 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/479dabcd-a158-4f12-9baf-77981001d3b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.153929 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vh5d5"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.155367 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/479dabcd-a158-4f12-9baf-77981001d3b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173299 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " 
pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173330 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173387 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479dabcd-a158-4f12-9baf-77981001d3b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173435 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prmdd\" (UniqueName: \"kubernetes.io/projected/479dabcd-a158-4f12-9baf-77981001d3b3-kube-api-access-prmdd\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173460 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/479dabcd-a158-4f12-9baf-77981001d3b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.173798 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/479dabcd-a158-4f12-9baf-77981001d3b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.174586 4735 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.176561 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vh5d5"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.177248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.180295 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.185643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/479dabcd-a158-4f12-9baf-77981001d3b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.187537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479dabcd-a158-4f12-9baf-77981001d3b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 
01:27:15.204465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/479dabcd-a158-4f12-9baf-77981001d3b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.215093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prmdd\" (UniqueName: \"kubernetes.io/projected/479dabcd-a158-4f12-9baf-77981001d3b3-kube-api-access-prmdd\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.245564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"479dabcd-a158-4f12-9baf-77981001d3b3\") " pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.275135 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-catalog-content\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.275195 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-utilities\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.275324 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-rjkzl\" (UniqueName: \"kubernetes.io/projected/7c13f928-2917-4244-8f49-bbb64c0e65ff-kube-api-access-rjkzl\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.285881 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.376493 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-catalog-content\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.376555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-utilities\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.376592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjkzl\" (UniqueName: \"kubernetes.io/projected/7c13f928-2917-4244-8f49-bbb64c0e65ff-kube-api-access-rjkzl\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.377570 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-catalog-content\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " 
pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.377818 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-utilities\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.396471 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjkzl\" (UniqueName: \"kubernetes.io/projected/7c13f928-2917-4244-8f49-bbb64c0e65ff-kube-api-access-rjkzl\") pod \"community-operators-vh5d5\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.461004 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.472417 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:15 crc kubenswrapper[4735]: W0317 01:27:15.472673 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9ecd2f_1a7b_4c45_b722_c88f25b27487.slice/crio-09406213dee28daa5ca7c222fa5a3305f77b41006592b35ab4ddc9c388636962 WatchSource:0}: Error finding container 09406213dee28daa5ca7c222fa5a3305f77b41006592b35ab4ddc9c388636962: Status 404 returned error can't find the container with id 09406213dee28daa5ca7c222fa5a3305f77b41006592b35ab4ddc9c388636962 Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.571445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e","Type":"ContainerStarted","Data":"cdd76d1b426c3fbc6f598e8cb35649494b662f5888648ef503cc97d391974fd8"} Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.614170 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb9ecd2f-1a7b-4c45-b722-c88f25b27487","Type":"ContainerStarted","Data":"09406213dee28daa5ca7c222fa5a3305f77b41006592b35ab4ddc9c388636962"} Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.884886 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.886083 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.926260 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.959597 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.959810 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.960424 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-tzrxz" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.960559 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.987327 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989326 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8xw\" (UniqueName: \"kubernetes.io/projected/632237b7-0f5e-426d-ae9e-e434ac0e1da6-kube-api-access-dd8xw\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989371 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989412 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/632237b7-0f5e-426d-ae9e-e434ac0e1da6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989468 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/632237b7-0f5e-426d-ae9e-e434ac0e1da6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:15 crc kubenswrapper[4735]: I0317 01:27:15.989511 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632237b7-0f5e-426d-ae9e-e434ac0e1da6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: W0317 01:27:16.014073 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod479dabcd_a158_4f12_9baf_77981001d3b3.slice/crio-8775f91affdf30034f58d87ea7fd2c6f0975a91acd2b02356f05ae632c76b254 WatchSource:0}: Error finding container 8775f91affdf30034f58d87ea7fd2c6f0975a91acd2b02356f05ae632c76b254: Status 404 returned error can't find the container with id 8775f91affdf30034f58d87ea7fd2c6f0975a91acd2b02356f05ae632c76b254 Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091563 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/632237b7-0f5e-426d-ae9e-e434ac0e1da6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091605 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632237b7-0f5e-426d-ae9e-e434ac0e1da6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " 
pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091689 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8xw\" (UniqueName: \"kubernetes.io/projected/632237b7-0f5e-426d-ae9e-e434ac0e1da6-kube-api-access-dd8xw\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091721 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091778 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.091794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/632237b7-0f5e-426d-ae9e-e434ac0e1da6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: 
I0317 01:27:16.092235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/632237b7-0f5e-426d-ae9e-e434ac0e1da6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.094490 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.100198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.100729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/632237b7-0f5e-426d-ae9e-e434ac0e1da6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.100955 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.114457 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/632237b7-0f5e-426d-ae9e-e434ac0e1da6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.115756 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632237b7-0f5e-426d-ae9e-e434ac0e1da6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.140471 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8xw\" (UniqueName: \"kubernetes.io/projected/632237b7-0f5e-426d-ae9e-e434ac0e1da6-kube-api-access-dd8xw\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.180928 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"632237b7-0f5e-426d-ae9e-e434ac0e1da6\") " pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.273338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.310197 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vh5d5"] Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.583176 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.584005 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.588104 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.588158 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-478kt" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.588331 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.609344 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.671530 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerStarted","Data":"d01eea389788f35ed5843477937abb0fc3eff8c1185f0d9fbdd8eb0ec8c4f046"} Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.713226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"479dabcd-a158-4f12-9baf-77981001d3b3","Type":"ContainerStarted","Data":"8775f91affdf30034f58d87ea7fd2c6f0975a91acd2b02356f05ae632c76b254"} Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.713816 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6d7635e-1b43-4bd6-af6d-234b9502545e-kolla-config\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.713901 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d7635e-1b43-4bd6-af6d-234b9502545e-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.714049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d7635e-1b43-4bd6-af6d-234b9502545e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.714223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7n84\" (UniqueName: \"kubernetes.io/projected/a6d7635e-1b43-4bd6-af6d-234b9502545e-kube-api-access-j7n84\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.714336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6d7635e-1b43-4bd6-af6d-234b9502545e-config-data\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.815705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7n84\" (UniqueName: \"kubernetes.io/projected/a6d7635e-1b43-4bd6-af6d-234b9502545e-kube-api-access-j7n84\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.815986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6d7635e-1b43-4bd6-af6d-234b9502545e-config-data\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 
01:27:16.816031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6d7635e-1b43-4bd6-af6d-234b9502545e-kolla-config\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.816068 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d7635e-1b43-4bd6-af6d-234b9502545e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.816135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d7635e-1b43-4bd6-af6d-234b9502545e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.817693 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6d7635e-1b43-4bd6-af6d-234b9502545e-config-data\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.817865 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6d7635e-1b43-4bd6-af6d-234b9502545e-kolla-config\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.822409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d7635e-1b43-4bd6-af6d-234b9502545e-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.832257 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7n84\" (UniqueName: \"kubernetes.io/projected/a6d7635e-1b43-4bd6-af6d-234b9502545e-kube-api-access-j7n84\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.839384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d7635e-1b43-4bd6-af6d-234b9502545e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6d7635e-1b43-4bd6-af6d-234b9502545e\") " pod="openstack/memcached-0" Mar 17 01:27:16 crc kubenswrapper[4735]: I0317 01:27:16.917065 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 17 01:27:17 crc kubenswrapper[4735]: W0317 01:27:17.229137 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod632237b7_0f5e_426d_ae9e_e434ac0e1da6.slice/crio-b7c1bec546877368544494da6d5b217e9a7b9780cdaedf8d5558009072de3afd WatchSource:0}: Error finding container b7c1bec546877368544494da6d5b217e9a7b9780cdaedf8d5558009072de3afd: Status 404 returned error can't find the container with id b7c1bec546877368544494da6d5b217e9a7b9780cdaedf8d5558009072de3afd Mar 17 01:27:17 crc kubenswrapper[4735]: I0317 01:27:17.236544 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 01:27:17 crc kubenswrapper[4735]: I0317 01:27:17.711586 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 17 01:27:17 crc kubenswrapper[4735]: I0317 01:27:17.732728 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"632237b7-0f5e-426d-ae9e-e434ac0e1da6","Type":"ContainerStarted","Data":"b7c1bec546877368544494da6d5b217e9a7b9780cdaedf8d5558009072de3afd"} Mar 17 01:27:17 crc kubenswrapper[4735]: I0317 01:27:17.737823 4735 generic.go:334] "Generic (PLEG): container finished" podID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerID="74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5" exitCode=0 Mar 17 01:27:17 crc kubenswrapper[4735]: I0317 01:27:17.737890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerDied","Data":"74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5"} Mar 17 01:27:17 crc kubenswrapper[4735]: W0317 01:27:17.820163 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6d7635e_1b43_4bd6_af6d_234b9502545e.slice/crio-e47ad7148328d41a80a8384ac06f26259eed2f7611b0d4a6895ae1661a4303a0 WatchSource:0}: Error finding container e47ad7148328d41a80a8384ac06f26259eed2f7611b0d4a6895ae1661a4303a0: Status 404 returned error can't find the container with id e47ad7148328d41a80a8384ac06f26259eed2f7611b0d4a6895ae1661a4303a0 Mar 17 01:27:18 crc kubenswrapper[4735]: I0317 01:27:18.235489 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7wt25" Mar 17 01:27:18 crc kubenswrapper[4735]: I0317 01:27:18.352955 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7wt25" Mar 17 01:27:18 crc kubenswrapper[4735]: I0317 01:27:18.762621 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a6d7635e-1b43-4bd6-af6d-234b9502545e","Type":"ContainerStarted","Data":"e47ad7148328d41a80a8384ac06f26259eed2f7611b0d4a6895ae1661a4303a0"} Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 
01:27:19.071266 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.077340 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.093658 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfq92\" (UniqueName: \"kubernetes.io/projected/1dfe3748-6604-4210-a284-1f0bfdcdf01f-kube-api-access-tfq92\") pod \"kube-state-metrics-0\" (UID: \"1dfe3748-6604-4210-a284-1f0bfdcdf01f\") " pod="openstack/kube-state-metrics-0" Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.107884 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bdthr" Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.141579 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.196561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfq92\" (UniqueName: \"kubernetes.io/projected/1dfe3748-6604-4210-a284-1f0bfdcdf01f-kube-api-access-tfq92\") pod \"kube-state-metrics-0\" (UID: \"1dfe3748-6604-4210-a284-1f0bfdcdf01f\") " pod="openstack/kube-state-metrics-0" Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.227128 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfq92\" (UniqueName: \"kubernetes.io/projected/1dfe3748-6604-4210-a284-1f0bfdcdf01f-kube-api-access-tfq92\") pod \"kube-state-metrics-0\" (UID: \"1dfe3748-6604-4210-a284-1f0bfdcdf01f\") " pod="openstack/kube-state-metrics-0" Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.403676 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.500713 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7wt25"] Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.784757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerStarted","Data":"cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025"} Mar 17 01:27:19 crc kubenswrapper[4735]: I0317 01:27:19.785013 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7wt25" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="registry-server" containerID="cri-o://6de41bc3c43351f45bb5f7fe34d56c08d327e112939d7e0765de7f778abcefe4" gracePeriod=2 Mar 17 01:27:20 crc kubenswrapper[4735]: I0317 01:27:20.003601 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 01:27:20 crc kubenswrapper[4735]: W0317 01:27:20.031848 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dfe3748_6604_4210_a284_1f0bfdcdf01f.slice/crio-32b0ac083d8e54fa7f4f1558f28d94789387877e191df2670f38fe2db9c0644b WatchSource:0}: Error finding container 32b0ac083d8e54fa7f4f1558f28d94789387877e191df2670f38fe2db9c0644b: Status 404 returned error can't find the container with id 32b0ac083d8e54fa7f4f1558f28d94789387877e191df2670f38fe2db9c0644b Mar 17 01:27:20 crc kubenswrapper[4735]: I0317 01:27:20.819215 4735 generic.go:334] "Generic (PLEG): container finished" podID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerID="cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025" exitCode=0 Mar 17 01:27:20 crc kubenswrapper[4735]: I0317 01:27:20.820168 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerDied","Data":"cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025"} Mar 17 01:27:20 crc kubenswrapper[4735]: I0317 01:27:20.829044 4735 generic.go:334] "Generic (PLEG): container finished" podID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerID="6de41bc3c43351f45bb5f7fe34d56c08d327e112939d7e0765de7f778abcefe4" exitCode=0 Mar 17 01:27:20 crc kubenswrapper[4735]: I0317 01:27:20.829112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wt25" event={"ID":"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9","Type":"ContainerDied","Data":"6de41bc3c43351f45bb5f7fe34d56c08d327e112939d7e0765de7f778abcefe4"} Mar 17 01:27:20 crc kubenswrapper[4735]: I0317 01:27:20.840647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1dfe3748-6604-4210-a284-1f0bfdcdf01f","Type":"ContainerStarted","Data":"32b0ac083d8e54fa7f4f1558f28d94789387877e191df2670f38fe2db9c0644b"} Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.922795 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9ngz2"] Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.923893 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.930528 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bdq8q" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.930872 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.931340 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.946901 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-log-ovn\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.946932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d35e229-2eeb-4843-a9a2-763156affef5-scripts\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.946949 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-run-ovn\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.946979 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d35e229-2eeb-4843-a9a2-763156affef5-combined-ca-bundle\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.947015 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znftq\" (UniqueName: \"kubernetes.io/projected/8d35e229-2eeb-4843-a9a2-763156affef5-kube-api-access-znftq\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.947061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d35e229-2eeb-4843-a9a2-763156affef5-ovn-controller-tls-certs\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.947091 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-run\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:21 crc kubenswrapper[4735]: I0317 01:27:21.984649 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ffshg"] Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:21.998597 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.030284 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9ngz2"] Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049170 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-log-ovn\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049344 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d35e229-2eeb-4843-a9a2-763156affef5-scripts\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-run-ovn\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049409 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e92e2c9c-efdb-43cf-9a17-cde88240c670-scripts\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35e229-2eeb-4843-a9a2-763156affef5-combined-ca-bundle\") pod \"ovn-controller-9ngz2\" (UID: 
\"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049471 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-lib\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049499 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znftq\" (UniqueName: \"kubernetes.io/projected/8d35e229-2eeb-4843-a9a2-763156affef5-kube-api-access-znftq\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049519 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-run\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-etc-ovs\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh272\" (UniqueName: \"kubernetes.io/projected/e92e2c9c-efdb-43cf-9a17-cde88240c670-kube-api-access-jh272\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " 
pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-log\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d35e229-2eeb-4843-a9a2-763156affef5-ovn-controller-tls-certs\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.049662 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-run\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.050788 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-run\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.051157 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-log-ovn\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.052436 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d35e229-2eeb-4843-a9a2-763156affef5-var-run-ovn\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.057875 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d35e229-2eeb-4843-a9a2-763156affef5-scripts\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.061553 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ffshg"] Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.065563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d35e229-2eeb-4843-a9a2-763156affef5-ovn-controller-tls-certs\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.068531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35e229-2eeb-4843-a9a2-763156affef5-combined-ca-bundle\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.069941 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znftq\" (UniqueName: \"kubernetes.io/projected/8d35e229-2eeb-4843-a9a2-763156affef5-kube-api-access-znftq\") pod \"ovn-controller-9ngz2\" (UID: \"8d35e229-2eeb-4843-a9a2-763156affef5\") " pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.161299 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e92e2c9c-efdb-43cf-9a17-cde88240c670-scripts\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.161397 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-lib\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.161429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-run\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.161488 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-etc-ovs\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.161537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh272\" (UniqueName: \"kubernetes.io/projected/e92e2c9c-efdb-43cf-9a17-cde88240c670-kube-api-access-jh272\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.161555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-log\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.163341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-log\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.163699 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-run\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.164284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-etc-ovs\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.164543 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e92e2c9c-efdb-43cf-9a17-cde88240c670-var-lib\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.174407 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e92e2c9c-efdb-43cf-9a17-cde88240c670-scripts\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 
01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.183084 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh272\" (UniqueName: \"kubernetes.io/projected/e92e2c9c-efdb-43cf-9a17-cde88240c670-kube-api-access-jh272\") pod \"ovn-controller-ovs-ffshg\" (UID: \"e92e2c9c-efdb-43cf-9a17-cde88240c670\") " pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.309194 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:22 crc kubenswrapper[4735]: I0317 01:27:22.331584 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.264665 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.266401 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.268384 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.268533 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.269789 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zngmh" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.269963 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.276132 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.279230 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.420636 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.420692 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.420910 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.421243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8507883-2fad-4c5c-90c9-58eb17711bb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.421289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " 
pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.421317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8507883-2fad-4c5c-90c9-58eb17711bb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.421392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7wp\" (UniqueName: \"kubernetes.io/projected/e8507883-2fad-4c5c-90c9-58eb17711bb3-kube-api-access-6s7wp\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.421502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8507883-2fad-4c5c-90c9-58eb17711bb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.440345 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.442582 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.451006 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.454061 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.454567 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.455080 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nfbp4" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.455409 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.529382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.529443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.529549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.529588 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8507883-2fad-4c5c-90c9-58eb17711bb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.529606 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.529653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8507883-2fad-4c5c-90c9-58eb17711bb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.530043 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.530080 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7wp\" (UniqueName: \"kubernetes.io/projected/e8507883-2fad-4c5c-90c9-58eb17711bb3-kube-api-access-6s7wp\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.530156 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8507883-2fad-4c5c-90c9-58eb17711bb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.530734 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8507883-2fad-4c5c-90c9-58eb17711bb3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.530963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8507883-2fad-4c5c-90c9-58eb17711bb3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.531484 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8507883-2fad-4c5c-90c9-58eb17711bb3-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.536771 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.540307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.543339 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8507883-2fad-4c5c-90c9-58eb17711bb3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.551139 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s7wp\" (UniqueName: \"kubernetes.io/projected/e8507883-2fad-4c5c-90c9-58eb17711bb3-kube-api-access-6s7wp\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.621580 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8507883-2fad-4c5c-90c9-58eb17711bb3\") " pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.631893 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.631944 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6be8d2b-aab5-415b-bb98-298563e9f719-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.631975 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cjs\" (UniqueName: \"kubernetes.io/projected/e6be8d2b-aab5-415b-bb98-298563e9f719-kube-api-access-p2cjs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.631999 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.632033 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6be8d2b-aab5-415b-bb98-298563e9f719-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.632055 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6be8d2b-aab5-415b-bb98-298563e9f719-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.632071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.632460 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.733958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6be8d2b-aab5-415b-bb98-298563e9f719-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6be8d2b-aab5-415b-bb98-298563e9f719-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734027 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6be8d2b-aab5-415b-bb98-298563e9f719-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cjs\" (UniqueName: \"kubernetes.io/projected/e6be8d2b-aab5-415b-bb98-298563e9f719-kube-api-access-p2cjs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.734937 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6be8d2b-aab5-415b-bb98-298563e9f719-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.735237 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6be8d2b-aab5-415b-bb98-298563e9f719-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.735769 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6be8d2b-aab5-415b-bb98-298563e9f719-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.736009 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.739293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.739845 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.741088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6be8d2b-aab5-415b-bb98-298563e9f719-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.754097 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2cjs\" (UniqueName: \"kubernetes.io/projected/e6be8d2b-aab5-415b-bb98-298563e9f719-kube-api-access-p2cjs\") pod 
\"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.759042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6be8d2b-aab5-415b-bb98-298563e9f719\") " pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.764186 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:25 crc kubenswrapper[4735]: I0317 01:27:25.890996 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.715188 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wt25" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.851014 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h8rs\" (UniqueName: \"kubernetes.io/projected/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-kube-api-access-9h8rs\") pod \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.851165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-utilities\") pod \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\" (UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.851236 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-catalog-content\") pod \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\" 
(UID: \"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9\") " Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.851796 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-utilities" (OuterVolumeSpecName: "utilities") pod "9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" (UID: "9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.859985 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-kube-api-access-9h8rs" (OuterVolumeSpecName: "kube-api-access-9h8rs") pod "9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" (UID: "9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9"). InnerVolumeSpecName "kube-api-access-9h8rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.919077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wt25" event={"ID":"9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9","Type":"ContainerDied","Data":"b5f69490b43f1463dab7dd02a1177a624bf9042265d00756c03a95c524843b74"} Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.919126 4735 scope.go:117] "RemoveContainer" containerID="6de41bc3c43351f45bb5f7fe34d56c08d327e112939d7e0765de7f778abcefe4" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.919245 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7wt25" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.953542 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h8rs\" (UniqueName: \"kubernetes.io/projected/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-kube-api-access-9h8rs\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.953570 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:26 crc kubenswrapper[4735]: I0317 01:27:26.959293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" (UID: "9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:27:27 crc kubenswrapper[4735]: I0317 01:27:27.054817 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:27 crc kubenswrapper[4735]: I0317 01:27:27.240371 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7wt25"] Mar 17 01:27:27 crc kubenswrapper[4735]: I0317 01:27:27.247465 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7wt25"] Mar 17 01:27:29 crc kubenswrapper[4735]: I0317 01:27:29.086278 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" path="/var/lib/kubelet/pods/9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9/volumes" Mar 17 01:27:31 crc kubenswrapper[4735]: I0317 01:27:31.787071 4735 scope.go:117] "RemoveContainer" containerID="d2b3bc96f86df9db28e5a2c9f64c96106e5f505dd0edb7effe9f23cc6d083a67" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.689897 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.690505 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.690703 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9,Command:[sh -c cp 
/tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmwrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext
{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(a4bda0b8-ed44-4576-a404-b25cc7f8ea6e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.692309 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.719063 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.719118 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.719237 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 
/var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hq6dn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAs
User:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(cb9ecd2f-1a7b-4c45-b722-c88f25b27487): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.720574 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.730686 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.730768 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.730942 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:e43235cb19da04699a53f42b6a75afe9,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dd8xw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(632237b7-0f5e-426d-ae9e-e434ac0e1da6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:27:39 crc kubenswrapper[4735]: E0317 01:27:39.733063 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="632237b7-0f5e-426d-ae9e-e434ac0e1da6" Mar 17 01:27:40 crc kubenswrapper[4735]: E0317 01:27:40.014517 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="632237b7-0f5e-426d-ae9e-e434ac0e1da6" Mar 17 01:27:40 crc kubenswrapper[4735]: E0317 01:27:40.014896 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/rabbitmq-server-0" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" Mar 17 01:27:40 crc kubenswrapper[4735]: E0317 01:27:40.014953 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" Mar 17 01:27:40 crc kubenswrapper[4735]: I0317 01:27:40.376005 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 01:27:40 crc kubenswrapper[4735]: I0317 
01:27:40.992933 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 01:27:41 crc kubenswrapper[4735]: I0317 01:27:41.286667 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9ngz2"] Mar 17 01:27:41 crc kubenswrapper[4735]: I0317 01:27:41.622825 4735 scope.go:117] "RemoveContainer" containerID="a7779779a85721143bf17539145a858de9870c35c9550e7b15054de532348819" Mar 17 01:27:41 crc kubenswrapper[4735]: W0317 01:27:41.635155 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6be8d2b_aab5_415b_bb98_298563e9f719.slice/crio-85cc6cfb6e8ea081129c5eb353ab1429ec8df6d75b966f2185afd5486e3ac61c WatchSource:0}: Error finding container 85cc6cfb6e8ea081129c5eb353ab1429ec8df6d75b966f2185afd5486e3ac61c: Status 404 returned error can't find the container with id 85cc6cfb6e8ea081129c5eb353ab1429ec8df6d75b966f2185afd5486e3ac61c Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.649827 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.649958 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.650088 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jghg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bd9cf7445-nzl4x_openstack(cb661cb5-2847-480f-9703-941bab7097d3): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.651272 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x" podUID="cb661cb5-2847-480f-9703-941bab7097d3" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.658101 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.658132 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.658200 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nml7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7b95c5c449-6m592_openstack(f44f7503-90aa-4ac1-b470-6b122a53d291): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.659873 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-7b95c5c449-6m592" podUID="f44f7503-90aa-4ac1-b470-6b122a53d291" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.678068 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.678115 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.678238 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cdhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-dcf85566c-m8vnj_openstack(f0ff000b-fe0a-4dfc-b976-92c57cf5595a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:27:41 crc kubenswrapper[4735]: E0317 01:27:41.679462 4735 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.040755 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6be8d2b-aab5-415b-bb98-298563e9f719","Type":"ContainerStarted","Data":"85cc6cfb6e8ea081129c5eb353ab1429ec8df6d75b966f2185afd5486e3ac61c"} Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.046703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8507883-2fad-4c5c-90c9-58eb17711bb3","Type":"ContainerStarted","Data":"9596625d22d4ded705df49505950183e08e2817a69f8adbaed5ad29ffb6f1744"} Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.050019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9ngz2" event={"ID":"8d35e229-2eeb-4843-a9a2-763156affef5","Type":"ContainerStarted","Data":"bb97e8702e342ca102b6e4540fe72d4d0ee9c1d4ca7707ff1bc50243a4791879"} Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.232506 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ffshg"] Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.576701 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b95c5c449-6m592" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.582526 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.747870 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44f7503-90aa-4ac1-b470-6b122a53d291-config\") pod \"f44f7503-90aa-4ac1-b470-6b122a53d291\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.748212 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jghg\" (UniqueName: \"kubernetes.io/projected/cb661cb5-2847-480f-9703-941bab7097d3-kube-api-access-7jghg\") pod \"cb661cb5-2847-480f-9703-941bab7097d3\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.748242 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-config\") pod \"cb661cb5-2847-480f-9703-941bab7097d3\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.748285 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-dns-svc\") pod \"cb661cb5-2847-480f-9703-941bab7097d3\" (UID: \"cb661cb5-2847-480f-9703-941bab7097d3\") " Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.748301 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nml7c\" (UniqueName: \"kubernetes.io/projected/f44f7503-90aa-4ac1-b470-6b122a53d291-kube-api-access-nml7c\") pod \"f44f7503-90aa-4ac1-b470-6b122a53d291\" (UID: \"f44f7503-90aa-4ac1-b470-6b122a53d291\") " Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.748572 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f44f7503-90aa-4ac1-b470-6b122a53d291-config" (OuterVolumeSpecName: "config") pod "f44f7503-90aa-4ac1-b470-6b122a53d291" (UID: "f44f7503-90aa-4ac1-b470-6b122a53d291"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.748865 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-config" (OuterVolumeSpecName: "config") pod "cb661cb5-2847-480f-9703-941bab7097d3" (UID: "cb661cb5-2847-480f-9703-941bab7097d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.749108 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb661cb5-2847-480f-9703-941bab7097d3" (UID: "cb661cb5-2847-480f-9703-941bab7097d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.753266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb661cb5-2847-480f-9703-941bab7097d3-kube-api-access-7jghg" (OuterVolumeSpecName: "kube-api-access-7jghg") pod "cb661cb5-2847-480f-9703-941bab7097d3" (UID: "cb661cb5-2847-480f-9703-941bab7097d3"). InnerVolumeSpecName "kube-api-access-7jghg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.754931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44f7503-90aa-4ac1-b470-6b122a53d291-kube-api-access-nml7c" (OuterVolumeSpecName: "kube-api-access-nml7c") pod "f44f7503-90aa-4ac1-b470-6b122a53d291" (UID: "f44f7503-90aa-4ac1-b470-6b122a53d291"). InnerVolumeSpecName "kube-api-access-nml7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.850377 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44f7503-90aa-4ac1-b470-6b122a53d291-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.850408 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jghg\" (UniqueName: \"kubernetes.io/projected/cb661cb5-2847-480f-9703-941bab7097d3-kube-api-access-7jghg\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.850419 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.850427 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb661cb5-2847-480f-9703-941bab7097d3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:42 crc kubenswrapper[4735]: I0317 01:27:42.850436 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nml7c\" (UniqueName: \"kubernetes.io/projected/f44f7503-90aa-4ac1-b470-6b122a53d291-kube-api-access-nml7c\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.068208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerStarted","Data":"b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb"} Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.082737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ffshg" 
event={"ID":"e92e2c9c-efdb-43cf-9a17-cde88240c670","Type":"ContainerStarted","Data":"3710c373dfdc51000166d5bb3db71918743b24b8173efcfeb1dfd6d5f2862870"} Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.087067 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b95c5c449-6m592" Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.087913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b95c5c449-6m592" event={"ID":"f44f7503-90aa-4ac1-b470-6b122a53d291","Type":"ContainerDied","Data":"1a910f4d751fe236422e52e99bda22cc84887860502fd2283f2fbbc42c6228cd"} Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.110939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x" event={"ID":"cb661cb5-2847-480f-9703-941bab7097d3","Type":"ContainerDied","Data":"1fcc600532bca908d9b20b0081ea11e4d8abd0b6b5064324b623c05b122d996a"} Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.111018 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd9cf7445-nzl4x" Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.119072 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vh5d5" podStartSLOduration=4.018346629 podStartE2EDuration="28.119058967s" podCreationTimestamp="2026-03-17 01:27:15 +0000 UTC" firstStartedPulling="2026-03-17 01:27:17.74027045 +0000 UTC m=+1063.372503428" lastFinishedPulling="2026-03-17 01:27:41.840982788 +0000 UTC m=+1087.473215766" observedRunningTime="2026-03-17 01:27:43.117316104 +0000 UTC m=+1088.749549072" watchObservedRunningTime="2026-03-17 01:27:43.119058967 +0000 UTC m=+1088.751291935" Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.293478 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd9cf7445-nzl4x"] Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.300922 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bd9cf7445-nzl4x"] Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.342166 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b95c5c449-6m592"] Mar 17 01:27:43 crc kubenswrapper[4735]: I0317 01:27:43.366769 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b95c5c449-6m592"] Mar 17 01:27:43 crc kubenswrapper[4735]: E0317 01:27:43.648054 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 17 01:27:43 crc kubenswrapper[4735]: E0317 01:27:43.648274 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 17 01:27:43 crc kubenswrapper[4735]: 
E0317 01:27:43.648397 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tfq92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(1dfe3748-6604-4210-a284-1f0bfdcdf01f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:27:43 crc kubenswrapper[4735]: E0317 01:27:43.650725 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.120151 4735 generic.go:334] "Generic (PLEG): container finished" podID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerID="abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b" exitCode=0 Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.120370 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" 
event={"ID":"d05a6355-3f32-4625-a6d4-9c0dbec18cb0","Type":"ContainerDied","Data":"abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b"} Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.128229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a6d7635e-1b43-4bd6-af6d-234b9502545e","Type":"ContainerStarted","Data":"182fed5feed948529c771afee083d12bdf7a81f118f4279b16df09d926a1169c"} Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.128323 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.150733 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"479dabcd-a158-4f12-9baf-77981001d3b3","Type":"ContainerStarted","Data":"4bfae0953fc37f710f8622b2850595e89cf59f8e165ed44ad84f9ed755ba23fa"} Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.153279 4735 generic.go:334] "Generic (PLEG): container finished" podID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerID="d3a33f154c8b57b81758cc751cc78e3f8d0ce52df0d78d8d12312cd69776a736" exitCode=0 Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.153689 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" event={"ID":"f0ff000b-fe0a-4dfc-b976-92c57cf5595a","Type":"ContainerDied","Data":"d3a33f154c8b57b81758cc751cc78e3f8d0ce52df0d78d8d12312cd69776a736"} Mar 17 01:27:44 crc kubenswrapper[4735]: E0317 01:27:44.156455 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" Mar 17 01:27:44 crc kubenswrapper[4735]: I0317 01:27:44.169258 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/memcached-0" podStartSLOduration=3.925670604 podStartE2EDuration="28.169244397s" podCreationTimestamp="2026-03-17 01:27:16 +0000 UTC" firstStartedPulling="2026-03-17 01:27:17.874935945 +0000 UTC m=+1063.507168933" lastFinishedPulling="2026-03-17 01:27:42.118509748 +0000 UTC m=+1087.750742726" observedRunningTime="2026-03-17 01:27:44.167165886 +0000 UTC m=+1089.799398864" watchObservedRunningTime="2026-03-17 01:27:44.169244397 +0000 UTC m=+1089.801477365" Mar 17 01:27:45 crc kubenswrapper[4735]: I0317 01:27:45.083241 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb661cb5-2847-480f-9703-941bab7097d3" path="/var/lib/kubelet/pods/cb661cb5-2847-480f-9703-941bab7097d3/volumes" Mar 17 01:27:45 crc kubenswrapper[4735]: I0317 01:27:45.083826 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44f7503-90aa-4ac1-b470-6b122a53d291" path="/var/lib/kubelet/pods/f44f7503-90aa-4ac1-b470-6b122a53d291/volumes" Mar 17 01:27:45 crc kubenswrapper[4735]: I0317 01:27:45.472536 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:45 crc kubenswrapper[4735]: I0317 01:27:45.473412 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:46 crc kubenswrapper[4735]: I0317 01:27:46.527975 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vh5d5" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="registry-server" probeResult="failure" output=< Mar 17 01:27:46 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:27:46 crc kubenswrapper[4735]: > Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.213942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"e8507883-2fad-4c5c-90c9-58eb17711bb3","Type":"ContainerStarted","Data":"a5a7fb4d225698a72d2eecd5fede9d792faf5cddfaadd571107a26867ca3b81b"} Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.224976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" event={"ID":"d05a6355-3f32-4625-a6d4-9c0dbec18cb0","Type":"ContainerStarted","Data":"15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03"} Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.225060 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.227142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9ngz2" event={"ID":"8d35e229-2eeb-4843-a9a2-763156affef5","Type":"ContainerStarted","Data":"ce14bd37abdcbbfc1629273e9ebc78d836ddcfdeade9cd8e21a2b9d7cb619799"} Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.227981 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9ngz2" Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.233927 4735 generic.go:334] "Generic (PLEG): container finished" podID="e92e2c9c-efdb-43cf-9a17-cde88240c670" containerID="1cfc1c918440882cd9d8d2fabd0d36d2a5098a630cbbc071ce9f214366c6f295" exitCode=0 Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.234034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ffshg" event={"ID":"e92e2c9c-efdb-43cf-9a17-cde88240c670","Type":"ContainerDied","Data":"1cfc1c918440882cd9d8d2fabd0d36d2a5098a630cbbc071ce9f214366c6f295"} Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.239101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6be8d2b-aab5-415b-bb98-298563e9f719","Type":"ContainerStarted","Data":"7797c8508313d7399f17ca5eecdf1e3e4526b8d57d2907d1992baad835e546dd"} Mar 17 
01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.251830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" event={"ID":"f0ff000b-fe0a-4dfc-b976-92c57cf5595a","Type":"ContainerStarted","Data":"33309d1c84731b36890aa55a86202a34401674495aff6484a5466bc6379ddce9"} Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.252473 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.253251 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" podStartSLOduration=6.678309299 podStartE2EDuration="34.253236852s" podCreationTimestamp="2026-03-17 01:27:13 +0000 UTC" firstStartedPulling="2026-03-17 01:27:14.383586942 +0000 UTC m=+1060.015819920" lastFinishedPulling="2026-03-17 01:27:41.958514495 +0000 UTC m=+1087.590747473" observedRunningTime="2026-03-17 01:27:47.248544948 +0000 UTC m=+1092.880777936" watchObservedRunningTime="2026-03-17 01:27:47.253236852 +0000 UTC m=+1092.885469830" Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.274913 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9ngz2" podStartSLOduration=21.696341705000002 podStartE2EDuration="26.274896851s" podCreationTimestamp="2026-03-17 01:27:21 +0000 UTC" firstStartedPulling="2026-03-17 01:27:41.710771142 +0000 UTC m=+1087.343004120" lastFinishedPulling="2026-03-17 01:27:46.289326268 +0000 UTC m=+1091.921559266" observedRunningTime="2026-03-17 01:27:47.269141451 +0000 UTC m=+1092.901374449" watchObservedRunningTime="2026-03-17 01:27:47.274896851 +0000 UTC m=+1092.907129829" Mar 17 01:27:47 crc kubenswrapper[4735]: I0317 01:27:47.314230 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" podStartSLOduration=-9223372001.540565 podStartE2EDuration="35.31421002s" 
podCreationTimestamp="2026-03-17 01:27:12 +0000 UTC" firstStartedPulling="2026-03-17 01:27:13.168452529 +0000 UTC m=+1058.800685507" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:27:47.305029816 +0000 UTC m=+1092.937262794" watchObservedRunningTime="2026-03-17 01:27:47.31421002 +0000 UTC m=+1092.946442998" Mar 17 01:27:48 crc kubenswrapper[4735]: I0317 01:27:48.273765 4735 generic.go:334] "Generic (PLEG): container finished" podID="479dabcd-a158-4f12-9baf-77981001d3b3" containerID="4bfae0953fc37f710f8622b2850595e89cf59f8e165ed44ad84f9ed755ba23fa" exitCode=0 Mar 17 01:27:48 crc kubenswrapper[4735]: I0317 01:27:48.274206 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"479dabcd-a158-4f12-9baf-77981001d3b3","Type":"ContainerDied","Data":"4bfae0953fc37f710f8622b2850595e89cf59f8e165ed44ad84f9ed755ba23fa"} Mar 17 01:27:48 crc kubenswrapper[4735]: I0317 01:27:48.278180 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ffshg" event={"ID":"e92e2c9c-efdb-43cf-9a17-cde88240c670","Type":"ContainerStarted","Data":"e1dbd7b959da307c40d49be2c15d270e1f3b419c8ebcdece41f92698602cb14b"} Mar 17 01:27:48 crc kubenswrapper[4735]: I0317 01:27:48.278326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ffshg" event={"ID":"e92e2c9c-efdb-43cf-9a17-cde88240c670","Type":"ContainerStarted","Data":"5727bfdaad87d587fe68d8e63b388f233065d0dd1ef1a6dfe7ed7ba2992aab8e"} Mar 17 01:27:48 crc kubenswrapper[4735]: I0317 01:27:48.327653 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ffshg" podStartSLOduration=23.356310441 podStartE2EDuration="27.327638753s" podCreationTimestamp="2026-03-17 01:27:21 +0000 UTC" firstStartedPulling="2026-03-17 01:27:42.286377674 +0000 UTC m=+1087.918610652" lastFinishedPulling="2026-03-17 01:27:46.257705986 +0000 UTC m=+1091.889938964" 
observedRunningTime="2026-03-17 01:27:48.324775774 +0000 UTC m=+1093.957008752" watchObservedRunningTime="2026-03-17 01:27:48.327638753 +0000 UTC m=+1093.959871731" Mar 17 01:27:49 crc kubenswrapper[4735]: I0317 01:27:49.288997 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:49 crc kubenswrapper[4735]: I0317 01:27:49.289071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.309109 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8507883-2fad-4c5c-90c9-58eb17711bb3","Type":"ContainerStarted","Data":"4dd86cd07c019e6409cee204899f26c5e809d4267ca566fa20bc98e758f35ef5"} Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.311590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"479dabcd-a158-4f12-9baf-77981001d3b3","Type":"ContainerStarted","Data":"a580d0fa43204a1d2c14bce504e4f47d07dc71bbaf8d865c8130b00470145b9d"} Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.314432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6be8d2b-aab5-415b-bb98-298563e9f719","Type":"ContainerStarted","Data":"352484fe837a5a2cf94103d8d56dfbf34443fcc9144e58262030f902ebc253aa"} Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.332384 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.983839204 podStartE2EDuration="26.33237053s" podCreationTimestamp="2026-03-17 01:27:24 +0000 UTC" firstStartedPulling="2026-03-17 01:27:41.710529566 +0000 UTC m=+1087.342762544" lastFinishedPulling="2026-03-17 01:27:50.059060872 +0000 UTC m=+1095.691293870" observedRunningTime="2026-03-17 01:27:50.329632473 +0000 UTC m=+1095.961865451" watchObservedRunningTime="2026-03-17 
01:27:50.33237053 +0000 UTC m=+1095.964603508" Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.362651 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=11.593898837 podStartE2EDuration="37.362631418s" podCreationTimestamp="2026-03-17 01:27:13 +0000 UTC" firstStartedPulling="2026-03-17 01:27:16.072257337 +0000 UTC m=+1061.704490315" lastFinishedPulling="2026-03-17 01:27:41.840989918 +0000 UTC m=+1087.473222896" observedRunningTime="2026-03-17 01:27:50.357289717 +0000 UTC m=+1095.989522715" watchObservedRunningTime="2026-03-17 01:27:50.362631418 +0000 UTC m=+1095.994864406" Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.382794 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.011376206 podStartE2EDuration="26.38277393s" podCreationTimestamp="2026-03-17 01:27:24 +0000 UTC" firstStartedPulling="2026-03-17 01:27:41.700271135 +0000 UTC m=+1087.332504113" lastFinishedPulling="2026-03-17 01:27:50.071668849 +0000 UTC m=+1095.703901837" observedRunningTime="2026-03-17 01:27:50.376285011 +0000 UTC m=+1096.008518009" watchObservedRunningTime="2026-03-17 01:27:50.38277393 +0000 UTC m=+1096.015006918" Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.765061 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:50 crc kubenswrapper[4735]: I0317 01:27:50.891362 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:51 crc kubenswrapper[4735]: I0317 01:27:51.919006 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 17 01:27:52 crc kubenswrapper[4735]: I0317 01:27:52.706389 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" Mar 17 01:27:52 crc kubenswrapper[4735]: 
I0317 01:27:52.764942 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:52 crc kubenswrapper[4735]: I0317 01:27:52.841723 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:52 crc kubenswrapper[4735]: I0317 01:27:52.892398 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:52 crc kubenswrapper[4735]: I0317 01:27:52.923207 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.382682 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.387747 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.539966 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86545856d7-wwdxv"] Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.540175 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="dnsmasq-dns" containerID="cri-o://15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03" gracePeriod=10 Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.541361 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.586204 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c8cc8c8d7-b6gnr"] Mar 17 01:27:53 crc kubenswrapper[4735]: E0317 01:27:53.586494 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="registry-server" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.586509 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="registry-server" Mar 17 01:27:53 crc kubenswrapper[4735]: E0317 01:27:53.586524 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="extract-utilities" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.586530 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="extract-utilities" Mar 17 01:27:53 crc kubenswrapper[4735]: E0317 01:27:53.586565 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="extract-content" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.586571 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="extract-content" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.586724 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccae0ab-f6d9-4cf3-aa34-1c5192ea3eb9" containerName="registry-server" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.587454 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.589428 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.598991 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8cc8c8d7-b6gnr"] Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.653030 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-config\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.653368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk9kj\" (UniqueName: \"kubernetes.io/projected/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-kube-api-access-dk9kj\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.653430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-ovsdbserver-nb\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.653473 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-dns-svc\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " 
pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.669520 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: connect: connection refused" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.755321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-config\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.755385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk9kj\" (UniqueName: \"kubernetes.io/projected/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-kube-api-access-dk9kj\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.755414 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-ovsdbserver-nb\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.755443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-dns-svc\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.756334 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-dns-svc\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.756908 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-config\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.757752 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-ovsdbserver-nb\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.788689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk9kj\" (UniqueName: \"kubernetes.io/projected/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-kube-api-access-dk9kj\") pod \"dnsmasq-dns-c8cc8c8d7-b6gnr\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.790077 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lg72z"] Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.793935 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.800590 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.856641 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.856727 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-config\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.857798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-ovn-rundir\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.857823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-ovs-rundir\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.857923 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-combined-ca-bundle\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.857948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7vr5\" (UniqueName: \"kubernetes.io/projected/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-kube-api-access-l7vr5\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.871322 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lg72z"] Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.909935 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.912094 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8cc8c8d7-b6gnr"] Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.958743 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-config\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.958786 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-ovn-rundir\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.958805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-ovs-rundir\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.958840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-combined-ca-bundle\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.958874 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7vr5\" (UniqueName: 
\"kubernetes.io/projected/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-kube-api-access-l7vr5\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.958910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.960667 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-config\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.960977 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-ovn-rundir\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.961251 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-ovs-rundir\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.964027 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-combined-ca-bundle\") pod 
\"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.971506 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.987915 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd56bc579-2ttvq"] Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.989206 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:53 crc kubenswrapper[4735]: I0317 01:27:53.999032 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.003174 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd56bc579-2ttvq"] Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.008719 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7vr5\" (UniqueName: \"kubernetes.io/projected/5bf91339-aeb6-4e5f-b712-295b0fcd38b9-kube-api-access-l7vr5\") pod \"ovn-controller-metrics-lg72z\" (UID: \"5bf91339-aeb6-4e5f-b712-295b0fcd38b9\") " pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.044346 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.045729 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.055143 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.055483 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.055590 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.055587 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.055730 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kvsfk" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.061699 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-sb\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.061739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-config\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.061763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5jd7\" (UniqueName: \"kubernetes.io/projected/a39de41a-dde6-454c-b73e-12d8a0935746-kube-api-access-d5jd7\") pod 
\"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.061804 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-dns-svc\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.061829 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-nb\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.125124 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.170192 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lg72z" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183548 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-config\") pod \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183588 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptttp\" (UniqueName: \"kubernetes.io/projected/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-kube-api-access-ptttp\") pod \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-dns-svc\") pod \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\" (UID: \"d05a6355-3f32-4625-a6d4-9c0dbec18cb0\") " Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183888 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-nb\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183913 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff00868f-9bd8-42cd-818b-b69373acd188-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183965 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff00868f-9bd8-42cd-818b-b69373acd188-config\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.183995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ws7t\" (UniqueName: \"kubernetes.io/projected/ff00868f-9bd8-42cd-818b-b69373acd188-kube-api-access-9ws7t\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff00868f-9bd8-42cd-818b-b69373acd188-scripts\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184151 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184174 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-sb\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-config\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5jd7\" (UniqueName: \"kubernetes.io/projected/a39de41a-dde6-454c-b73e-12d8a0935746-kube-api-access-d5jd7\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.184271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-dns-svc\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.192736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.193225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-dns-svc\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.193551 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-config\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.194904 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-kube-api-access-ptttp" (OuterVolumeSpecName: "kube-api-access-ptttp") pod "d05a6355-3f32-4625-a6d4-9c0dbec18cb0" (UID: "d05a6355-3f32-4625-a6d4-9c0dbec18cb0"). InnerVolumeSpecName "kube-api-access-ptttp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.198349 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-nb\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.217741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5jd7\" (UniqueName: \"kubernetes.io/projected/a39de41a-dde6-454c-b73e-12d8a0935746-kube-api-access-d5jd7\") pod \"dnsmasq-dns-5cd56bc579-2ttvq\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.244846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d05a6355-3f32-4625-a6d4-9c0dbec18cb0" (UID: "d05a6355-3f32-4625-a6d4-9c0dbec18cb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.266459 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-config" (OuterVolumeSpecName: "config") pod "d05a6355-3f32-4625-a6d4-9c0dbec18cb0" (UID: "d05a6355-3f32-4625-a6d4-9c0dbec18cb0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff00868f-9bd8-42cd-818b-b69373acd188-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288126 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff00868f-9bd8-42cd-818b-b69373acd188-config\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288153 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288187 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ws7t\" (UniqueName: \"kubernetes.io/projected/ff00868f-9bd8-42cd-818b-b69373acd188-kube-api-access-9ws7t\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288227 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/ff00868f-9bd8-42cd-818b-b69373acd188-scripts\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288355 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288367 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptttp\" (UniqueName: \"kubernetes.io/projected/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-kube-api-access-ptttp\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.288376 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05a6355-3f32-4625-a6d4-9c0dbec18cb0-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.289017 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff00868f-9bd8-42cd-818b-b69373acd188-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.289473 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff00868f-9bd8-42cd-818b-b69373acd188-config\") pod \"ovn-northd-0\" (UID: 
\"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.289875 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff00868f-9bd8-42cd-818b-b69373acd188-scripts\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.292271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.292530 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.293309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff00868f-9bd8-42cd-818b-b69373acd188-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.305637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ws7t\" (UniqueName: \"kubernetes.io/projected/ff00868f-9bd8-42cd-818b-b69373acd188-kube-api-access-9ws7t\") pod \"ovn-northd-0\" (UID: \"ff00868f-9bd8-42cd-818b-b69373acd188\") " pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.349358 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerID="15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03" exitCode=0 Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.349907 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" event={"ID":"d05a6355-3f32-4625-a6d4-9c0dbec18cb0","Type":"ContainerDied","Data":"15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03"} Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.349948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" event={"ID":"d05a6355-3f32-4625-a6d4-9c0dbec18cb0","Type":"ContainerDied","Data":"39c46b2e6a167aee8ed2b26e317a7b138063beb15176f740ceecc527ad884ba4"} Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.349967 4735 scope.go:117] "RemoveContainer" containerID="15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.349917 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86545856d7-wwdxv" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.376733 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.377680 4735 scope.go:117] "RemoveContainer" containerID="abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.382474 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86545856d7-wwdxv"] Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.387980 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86545856d7-wwdxv"] Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.412795 4735 scope.go:117] "RemoveContainer" containerID="15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03" Mar 17 01:27:54 crc kubenswrapper[4735]: E0317 01:27:54.413357 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03\": container with ID starting with 15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03 not found: ID does not exist" containerID="15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.413398 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03"} err="failed to get container status \"15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03\": rpc error: code = NotFound desc = could not find container \"15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03\": container with ID starting with 15632e220d6947ccedbd867998d85abff62dd571bef2a423d900c81e62ec7b03 not found: ID does not exist" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.413422 4735 scope.go:117] "RemoveContainer" containerID="abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b" Mar 17 
01:27:54 crc kubenswrapper[4735]: E0317 01:27:54.413824 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b\": container with ID starting with abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b not found: ID does not exist" containerID="abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.413884 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b"} err="failed to get container status \"abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b\": rpc error: code = NotFound desc = could not find container \"abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b\": container with ID starting with abdfc480b3c38e52b320679eaa6b0002e95c23fd12e9d97ea16627e40b6a359b not found: ID does not exist" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.420950 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.468384 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8cc8c8d7-b6gnr"] Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.629727 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lg72z"] Mar 17 01:27:54 crc kubenswrapper[4735]: W0317 01:27:54.643828 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bf91339_aeb6_4e5f_b712_295b0fcd38b9.slice/crio-9a34781edfc745bc6cd9350f1d5fc502fe9c86d2c30bfef9be51e0cb9c2d6187 WatchSource:0}: Error finding container 9a34781edfc745bc6cd9350f1d5fc502fe9c86d2c30bfef9be51e0cb9c2d6187: Status 404 returned error can't find the container with id 9a34781edfc745bc6cd9350f1d5fc502fe9c86d2c30bfef9be51e0cb9c2d6187 Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.667333 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd56bc579-2ttvq"] Mar 17 01:27:54 crc kubenswrapper[4735]: W0317 01:27:54.690110 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39de41a_dde6_454c_b73e_12d8a0935746.slice/crio-359b6ae1e6085a8dc593f99df856fac2082e2123e8792dac77e7564985dad7ac WatchSource:0}: Error finding container 359b6ae1e6085a8dc593f99df856fac2082e2123e8792dac77e7564985dad7ac: Status 404 returned error can't find the container with id 359b6ae1e6085a8dc593f99df856fac2082e2123e8792dac77e7564985dad7ac Mar 17 01:27:54 crc kubenswrapper[4735]: I0317 01:27:54.992602 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 17 01:27:55 crc kubenswrapper[4735]: W0317 01:27:55.011899 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff00868f_9bd8_42cd_818b_b69373acd188.slice/crio-03ae02be2ae9cfe8d20e40636be631039d1170dd9e57e4cb94f701f8a489ac09 WatchSource:0}: Error finding container 03ae02be2ae9cfe8d20e40636be631039d1170dd9e57e4cb94f701f8a489ac09: Status 404 returned error can't find the container with id 03ae02be2ae9cfe8d20e40636be631039d1170dd9e57e4cb94f701f8a489ac09 Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.085274 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" path="/var/lib/kubelet/pods/d05a6355-3f32-4625-a6d4-9c0dbec18cb0/volumes" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.286566 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.287897 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.366197 4735 generic.go:334] "Generic (PLEG): container finished" podID="a39de41a-dde6-454c-b73e-12d8a0935746" containerID="3634b58e9fd43637745f7f2d14bd8a7cdc4afdc20d4289058567f9b030ff320b" exitCode=0 Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.366284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" event={"ID":"a39de41a-dde6-454c-b73e-12d8a0935746","Type":"ContainerDied","Data":"3634b58e9fd43637745f7f2d14bd8a7cdc4afdc20d4289058567f9b030ff320b"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.366309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" event={"ID":"a39de41a-dde6-454c-b73e-12d8a0935746","Type":"ContainerStarted","Data":"359b6ae1e6085a8dc593f99df856fac2082e2123e8792dac77e7564985dad7ac"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.370882 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-controller-metrics-lg72z" event={"ID":"5bf91339-aeb6-4e5f-b712-295b0fcd38b9","Type":"ContainerStarted","Data":"5f3d8aef65d67aa5d94a4a05cfeeaa37f0d77e7e4d2c1f34c40453a1a7a9b6af"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.370924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lg72z" event={"ID":"5bf91339-aeb6-4e5f-b712-295b0fcd38b9","Type":"ContainerStarted","Data":"9a34781edfc745bc6cd9350f1d5fc502fe9c86d2c30bfef9be51e0cb9c2d6187"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.393470 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff00868f-9bd8-42cd-818b-b69373acd188","Type":"ContainerStarted","Data":"03ae02be2ae9cfe8d20e40636be631039d1170dd9e57e4cb94f701f8a489ac09"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.414547 4735 generic.go:334] "Generic (PLEG): container finished" podID="c55d4ec2-37a3-4bef-853c-d52ef64c08fe" containerID="b0c481d40fb0d89c223ed9abdb630e9f1306e4407218da4d846a7519d1eebaab" exitCode=0 Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.414625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" event={"ID":"c55d4ec2-37a3-4bef-853c-d52ef64c08fe","Type":"ContainerDied","Data":"b0c481d40fb0d89c223ed9abdb630e9f1306e4407218da4d846a7519d1eebaab"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.414650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" event={"ID":"c55d4ec2-37a3-4bef-853c-d52ef64c08fe","Type":"ContainerStarted","Data":"7d5747251ae42bdde40ec6aed5ffb3577950567fb62b1c3e7cafdedc380e87a5"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.455367 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cb9ecd2f-1a7b-4c45-b722-c88f25b27487","Type":"ContainerStarted","Data":"d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423"} Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.468684 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lg72z" podStartSLOduration=2.468667222 podStartE2EDuration="2.468667222s" podCreationTimestamp="2026-03-17 01:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:27:55.420954248 +0000 UTC m=+1101.053187226" watchObservedRunningTime="2026-03-17 01:27:55.468667222 +0000 UTC m=+1101.100900200" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.567981 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.623433 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.635317 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.800805 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vh5d5"] Mar 17 01:27:55 crc kubenswrapper[4735]: I0317 01:27:55.874757 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.015695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-ovsdbserver-nb\") pod \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.015780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-config\") pod \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.015940 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk9kj\" (UniqueName: \"kubernetes.io/projected/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-kube-api-access-dk9kj\") pod \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.015972 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-dns-svc\") pod \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\" (UID: \"c55d4ec2-37a3-4bef-853c-d52ef64c08fe\") " Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.021841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-kube-api-access-dk9kj" (OuterVolumeSpecName: "kube-api-access-dk9kj") pod "c55d4ec2-37a3-4bef-853c-d52ef64c08fe" (UID: "c55d4ec2-37a3-4bef-853c-d52ef64c08fe"). InnerVolumeSpecName "kube-api-access-dk9kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.037536 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c55d4ec2-37a3-4bef-853c-d52ef64c08fe" (UID: "c55d4ec2-37a3-4bef-853c-d52ef64c08fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.043101 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-config" (OuterVolumeSpecName: "config") pod "c55d4ec2-37a3-4bef-853c-d52ef64c08fe" (UID: "c55d4ec2-37a3-4bef-853c-d52ef64c08fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.043496 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c55d4ec2-37a3-4bef-853c-d52ef64c08fe" (UID: "c55d4ec2-37a3-4bef-853c-d52ef64c08fe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.117879 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk9kj\" (UniqueName: \"kubernetes.io/projected/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-kube-api-access-dk9kj\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.118085 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.118150 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.118209 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55d4ec2-37a3-4bef-853c-d52ef64c08fe-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.463043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" event={"ID":"a39de41a-dde6-454c-b73e-12d8a0935746","Type":"ContainerStarted","Data":"60c72e8360603a750e0ac19c1c75fc66e664381449c7577e001b0a88aeae1fc3"} Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.463597 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.465292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff00868f-9bd8-42cd-818b-b69373acd188","Type":"ContainerStarted","Data":"8fc2e141678cf661f532d105a6a31070e0ef819d9e5a485bc1249afe0644954d"} Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.467509 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" event={"ID":"c55d4ec2-37a3-4bef-853c-d52ef64c08fe","Type":"ContainerDied","Data":"7d5747251ae42bdde40ec6aed5ffb3577950567fb62b1c3e7cafdedc380e87a5"} Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.467555 4735 scope.go:117] "RemoveContainer" containerID="b0c481d40fb0d89c223ed9abdb630e9f1306e4407218da4d846a7519d1eebaab" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.468015 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8cc8c8d7-b6gnr" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.470187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e","Type":"ContainerStarted","Data":"7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155"} Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.473556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"632237b7-0f5e-426d-ae9e-e434ac0e1da6","Type":"ContainerStarted","Data":"1536fe5e333d5a20e40588cb24ce1ca2f4e6fa6537e0eeb1d8f761c9bf0e62f8"} Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.485891 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" podStartSLOduration=3.485845656 podStartE2EDuration="3.485845656s" podCreationTimestamp="2026-03-17 01:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:27:56.484689068 +0000 UTC m=+1102.116922046" watchObservedRunningTime="2026-03-17 01:27:56.485845656 +0000 UTC m=+1102.118078644" Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.567134 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8cc8c8d7-b6gnr"] Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 
01:27:56.567208 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c8cc8c8d7-b6gnr"] Mar 17 01:27:56 crc kubenswrapper[4735]: I0317 01:27:56.619795 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.085941 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55d4ec2-37a3-4bef-853c-d52ef64c08fe" path="/var/lib/kubelet/pods/c55d4ec2-37a3-4bef-853c-d52ef64c08fe/volumes" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.484236 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff00868f-9bd8-42cd-818b-b69373acd188","Type":"ContainerStarted","Data":"c128f0d8deda4ad3f32b1a545f08c1d8d5aee12236e93e719381295fb4bc4d0c"} Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.484415 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.486619 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vh5d5" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="registry-server" containerID="cri-o://b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb" gracePeriod=2 Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.517409 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.349081468 podStartE2EDuration="4.51739174s" podCreationTimestamp="2026-03-17 01:27:53 +0000 UTC" firstStartedPulling="2026-03-17 01:27:55.014058981 +0000 UTC m=+1100.646291959" lastFinishedPulling="2026-03-17 01:27:56.182369253 +0000 UTC m=+1101.814602231" observedRunningTime="2026-03-17 01:27:57.516748915 +0000 UTC m=+1103.148981903" watchObservedRunningTime="2026-03-17 01:27:57.51739174 +0000 UTC m=+1103.149624729" Mar 17 01:27:57 crc 
kubenswrapper[4735]: I0317 01:27:57.660558 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1db9-account-create-update-sqmm7"] Mar 17 01:27:57 crc kubenswrapper[4735]: E0317 01:27:57.661087 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="init" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.661103 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="init" Mar 17 01:27:57 crc kubenswrapper[4735]: E0317 01:27:57.661129 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="dnsmasq-dns" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.661135 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="dnsmasq-dns" Mar 17 01:27:57 crc kubenswrapper[4735]: E0317 01:27:57.661157 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55d4ec2-37a3-4bef-853c-d52ef64c08fe" containerName="init" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.661162 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55d4ec2-37a3-4bef-853c-d52ef64c08fe" containerName="init" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.661308 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55d4ec2-37a3-4bef-853c-d52ef64c08fe" containerName="init" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.661318 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05a6355-3f32-4625-a6d4-9c0dbec18cb0" containerName="dnsmasq-dns" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.661780 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.665642 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.672954 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1db9-account-create-update-sqmm7"] Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.745619 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-nw7cf"] Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.746892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.754195 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xscgb\" (UniqueName: \"kubernetes.io/projected/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-kube-api-access-xscgb\") pod \"keystone-1db9-account-create-update-sqmm7\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.754301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-operator-scripts\") pod \"keystone-1db9-account-create-update-sqmm7\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.760190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-nw7cf"] Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.837114 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-v6brj"] Mar 17 01:27:57 crc 
kubenswrapper[4735]: I0317 01:27:57.838081 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v6brj" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.855894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-operator-scripts\") pod \"keystone-db-create-nw7cf\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.855960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xscgb\" (UniqueName: \"kubernetes.io/projected/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-kube-api-access-xscgb\") pod \"keystone-1db9-account-create-update-sqmm7\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.856021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6zlv\" (UniqueName: \"kubernetes.io/projected/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-kube-api-access-d6zlv\") pod \"keystone-db-create-nw7cf\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.856072 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-operator-scripts\") pod \"keystone-1db9-account-create-update-sqmm7\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.856765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-operator-scripts\") pod \"keystone-1db9-account-create-update-sqmm7\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.866803 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-v6brj"] Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.920846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xscgb\" (UniqueName: \"kubernetes.io/projected/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-kube-api-access-xscgb\") pod \"keystone-1db9-account-create-update-sqmm7\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.957877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6zlv\" (UniqueName: \"kubernetes.io/projected/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-kube-api-access-d6zlv\") pod \"keystone-db-create-nw7cf\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.957949 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6678bd59-bc5f-4441-b5c7-f68abbf8f385-operator-scripts\") pod \"placement-db-create-v6brj\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " pod="openstack/placement-db-create-v6brj" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.958008 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-operator-scripts\") pod \"keystone-db-create-nw7cf\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " 
pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.958026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdvc\" (UniqueName: \"kubernetes.io/projected/6678bd59-bc5f-4441-b5c7-f68abbf8f385-kube-api-access-2cdvc\") pod \"placement-db-create-v6brj\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " pod="openstack/placement-db-create-v6brj" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.958971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-operator-scripts\") pod \"keystone-db-create-nw7cf\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.974585 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0c55-account-create-update-7msf2"] Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.975526 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.977377 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.988579 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:27:57 crc kubenswrapper[4735]: I0317 01:27:57.989491 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6zlv\" (UniqueName: \"kubernetes.io/projected/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-kube-api-access-d6zlv\") pod \"keystone-db-create-nw7cf\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.007329 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0c55-account-create-update-7msf2"] Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.059394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3dfa49-bee3-46cf-982d-6b126b72a46b-operator-scripts\") pod \"placement-0c55-account-create-update-7msf2\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.059576 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6678bd59-bc5f-4441-b5c7-f68abbf8f385-operator-scripts\") pod \"placement-db-create-v6brj\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " pod="openstack/placement-db-create-v6brj" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.059640 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz9s4\" (UniqueName: \"kubernetes.io/projected/ee3dfa49-bee3-46cf-982d-6b126b72a46b-kube-api-access-zz9s4\") pod \"placement-0c55-account-create-update-7msf2\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.059688 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdvc\" (UniqueName: \"kubernetes.io/projected/6678bd59-bc5f-4441-b5c7-f68abbf8f385-kube-api-access-2cdvc\") pod \"placement-db-create-v6brj\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " pod="openstack/placement-db-create-v6brj" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.061400 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6678bd59-bc5f-4441-b5c7-f68abbf8f385-operator-scripts\") pod \"placement-db-create-v6brj\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " pod="openstack/placement-db-create-v6brj" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.070370 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.076455 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdvc\" (UniqueName: \"kubernetes.io/projected/6678bd59-bc5f-4441-b5c7-f68abbf8f385-kube-api-access-2cdvc\") pod \"placement-db-create-v6brj\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " pod="openstack/placement-db-create-v6brj" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.088433 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nw7cf" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.157159 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-v6brj" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.160375 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjkzl\" (UniqueName: \"kubernetes.io/projected/7c13f928-2917-4244-8f49-bbb64c0e65ff-kube-api-access-rjkzl\") pod \"7c13f928-2917-4244-8f49-bbb64c0e65ff\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.160455 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-catalog-content\") pod \"7c13f928-2917-4244-8f49-bbb64c0e65ff\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.160581 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-utilities\") pod \"7c13f928-2917-4244-8f49-bbb64c0e65ff\" (UID: \"7c13f928-2917-4244-8f49-bbb64c0e65ff\") " Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.160819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz9s4\" (UniqueName: \"kubernetes.io/projected/ee3dfa49-bee3-46cf-982d-6b126b72a46b-kube-api-access-zz9s4\") pod \"placement-0c55-account-create-update-7msf2\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.160900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3dfa49-bee3-46cf-982d-6b126b72a46b-operator-scripts\") pod \"placement-0c55-account-create-update-7msf2\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 
01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.161266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-utilities" (OuterVolumeSpecName: "utilities") pod "7c13f928-2917-4244-8f49-bbb64c0e65ff" (UID: "7c13f928-2917-4244-8f49-bbb64c0e65ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.162390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3dfa49-bee3-46cf-982d-6b126b72a46b-operator-scripts\") pod \"placement-0c55-account-create-update-7msf2\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.177767 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz9s4\" (UniqueName: \"kubernetes.io/projected/ee3dfa49-bee3-46cf-982d-6b126b72a46b-kube-api-access-zz9s4\") pod \"placement-0c55-account-create-update-7msf2\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.191042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c13f928-2917-4244-8f49-bbb64c0e65ff-kube-api-access-rjkzl" (OuterVolumeSpecName: "kube-api-access-rjkzl") pod "7c13f928-2917-4244-8f49-bbb64c0e65ff" (UID: "7c13f928-2917-4244-8f49-bbb64c0e65ff"). InnerVolumeSpecName "kube-api-access-rjkzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.231929 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c13f928-2917-4244-8f49-bbb64c0e65ff" (UID: "7c13f928-2917-4244-8f49-bbb64c0e65ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.267346 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.267581 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c13f928-2917-4244-8f49-bbb64c0e65ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.267590 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjkzl\" (UniqueName: \"kubernetes.io/projected/7c13f928-2917-4244-8f49-bbb64c0e65ff-kube-api-access-rjkzl\") on node \"crc\" DevicePath \"\"" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.382724 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.484074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1db9-account-create-update-sqmm7"] Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.543881 4735 generic.go:334] "Generic (PLEG): container finished" podID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerID="b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb" exitCode=0 Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.543960 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerDied","Data":"b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb"} Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.543985 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh5d5" event={"ID":"7c13f928-2917-4244-8f49-bbb64c0e65ff","Type":"ContainerDied","Data":"d01eea389788f35ed5843477937abb0fc3eff8c1185f0d9fbdd8eb0ec8c4f046"} Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.544001 4735 scope.go:117] "RemoveContainer" containerID="b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.544149 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vh5d5" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.551512 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1dfe3748-6604-4210-a284-1f0bfdcdf01f","Type":"ContainerStarted","Data":"5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3"} Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.552841 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.574180 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.1565047 podStartE2EDuration="39.574165171s" podCreationTimestamp="2026-03-17 01:27:19 +0000 UTC" firstStartedPulling="2026-03-17 01:27:20.047562707 +0000 UTC m=+1065.679795685" lastFinishedPulling="2026-03-17 01:27:57.465223138 +0000 UTC m=+1103.097456156" observedRunningTime="2026-03-17 01:27:58.573464234 +0000 UTC m=+1104.205697212" watchObservedRunningTime="2026-03-17 01:27:58.574165171 +0000 UTC m=+1104.206398149" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.580055 4735 scope.go:117] "RemoveContainer" containerID="cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.611173 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vh5d5"] Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.616742 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vh5d5"] Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.625088 4735 scope.go:117] "RemoveContainer" containerID="74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.627563 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-create-nw7cf"] Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.633916 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-v6brj"] Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.676787 4735 scope.go:117] "RemoveContainer" containerID="b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb" Mar 17 01:27:58 crc kubenswrapper[4735]: E0317 01:27:58.681690 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb\": container with ID starting with b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb not found: ID does not exist" containerID="b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.681720 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb"} err="failed to get container status \"b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb\": rpc error: code = NotFound desc = could not find container \"b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb\": container with ID starting with b663374afd35036cf639dc968cecab16940109d0c99b631f78ff31622d2511bb not found: ID does not exist" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.681740 4735 scope.go:117] "RemoveContainer" containerID="cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025" Mar 17 01:27:58 crc kubenswrapper[4735]: E0317 01:27:58.682073 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025\": container with ID starting with cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025 not found: ID does not 
exist" containerID="cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.682107 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025"} err="failed to get container status \"cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025\": rpc error: code = NotFound desc = could not find container \"cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025\": container with ID starting with cccdd51f8387119818ba8c25a456891a995ebc539e7d31e9927f29829ee82025 not found: ID does not exist" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.682131 4735 scope.go:117] "RemoveContainer" containerID="74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5" Mar 17 01:27:58 crc kubenswrapper[4735]: E0317 01:27:58.682381 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5\": container with ID starting with 74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5 not found: ID does not exist" containerID="74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.682404 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5"} err="failed to get container status \"74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5\": rpc error: code = NotFound desc = could not find container \"74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5\": container with ID starting with 74837f50d2fabae7a1c7ba08e7a1009b81a4c20c6d6ac1414cae23fadb3b3bc5 not found: ID does not exist" Mar 17 01:27:58 crc kubenswrapper[4735]: I0317 01:27:58.845514 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0c55-account-create-update-7msf2"] Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.081830 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" path="/var/lib/kubelet/pods/7c13f928-2917-4244-8f49-bbb64c0e65ff/volumes" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.370195 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd56bc579-2ttvq"] Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.373346 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" containerName="dnsmasq-dns" containerID="cri-o://60c72e8360603a750e0ac19c1c75fc66e664381449c7577e001b0a88aeae1fc3" gracePeriod=10 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.417158 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8c8d4885-pj7bf"] Mar 17 01:27:59 crc kubenswrapper[4735]: E0317 01:27:59.417429 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="extract-utilities" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.417445 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="extract-utilities" Mar 17 01:27:59 crc kubenswrapper[4735]: E0317 01:27:59.417465 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="extract-content" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.417471 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="extract-content" Mar 17 01:27:59 crc kubenswrapper[4735]: E0317 01:27:59.417480 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="registry-server" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.417495 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="registry-server" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.417628 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c13f928-2917-4244-8f49-bbb64c0e65ff" containerName="registry-server" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.419631 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.434320 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8c8d4885-pj7bf"] Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.500507 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-dns-svc\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.500554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-config\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.500601 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " 
pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.500646 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.500712 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47blw\" (UniqueName: \"kubernetes.io/projected/b7541744-776e-4960-8eaf-039cc5489054-kube-api-access-47blw\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.562181 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee3dfa49-bee3-46cf-982d-6b126b72a46b" containerID="d0c1f186b72363bd57f61da70ab2cff09fccf6706495b67df421fd61a118cab3" exitCode=0 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.562261 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0c55-account-create-update-7msf2" event={"ID":"ee3dfa49-bee3-46cf-982d-6b126b72a46b","Type":"ContainerDied","Data":"d0c1f186b72363bd57f61da70ab2cff09fccf6706495b67df421fd61a118cab3"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.562286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0c55-account-create-update-7msf2" event={"ID":"ee3dfa49-bee3-46cf-982d-6b126b72a46b","Type":"ContainerStarted","Data":"050bc1c178b834cebcab2c51eaf3f2538131e11f1bcde6b6ee79ffe2ba87137a"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.566128 4735 generic.go:334] "Generic (PLEG): container finished" podID="6678bd59-bc5f-4441-b5c7-f68abbf8f385" 
containerID="037fb4e644505ab8593322897ca936e5d8b7c7a907380a68a6d599453e33be17" exitCode=0 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.566257 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v6brj" event={"ID":"6678bd59-bc5f-4441-b5c7-f68abbf8f385","Type":"ContainerDied","Data":"037fb4e644505ab8593322897ca936e5d8b7c7a907380a68a6d599453e33be17"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.566332 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v6brj" event={"ID":"6678bd59-bc5f-4441-b5c7-f68abbf8f385","Type":"ContainerStarted","Data":"bcac8410a17ca7d9730ca606019712c575620c27721a921ebe597643c33c7406"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.567398 4735 generic.go:334] "Generic (PLEG): container finished" podID="c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" containerID="c3c74813aabd7080ea6c26cf817a2dfdb7b4a87ac38118760ef61dd31cd50a24" exitCode=0 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.567506 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1db9-account-create-update-sqmm7" event={"ID":"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9","Type":"ContainerDied","Data":"c3c74813aabd7080ea6c26cf817a2dfdb7b4a87ac38118760ef61dd31cd50a24"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.567578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1db9-account-create-update-sqmm7" event={"ID":"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9","Type":"ContainerStarted","Data":"10e4c63ecbbfc1f2574eef151cbdd8a78bdba2706e0e650092c60b666c10d028"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.569235 4735 generic.go:334] "Generic (PLEG): container finished" podID="1e6b4495-838e-4cda-9d1f-7b087fe4ec50" containerID="4d6ad7277b88a97c02ef32dc3a50769fc6d1b17c26a84bbf63a7578431d82106" exitCode=0 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.569293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-nw7cf" event={"ID":"1e6b4495-838e-4cda-9d1f-7b087fe4ec50","Type":"ContainerDied","Data":"4d6ad7277b88a97c02ef32dc3a50769fc6d1b17c26a84bbf63a7578431d82106"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.569357 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nw7cf" event={"ID":"1e6b4495-838e-4cda-9d1f-7b087fe4ec50","Type":"ContainerStarted","Data":"2c0f90e414388bd466319b05ba9f5705e1fdc65602252019c0a2abe0a86d104d"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.570547 4735 generic.go:334] "Generic (PLEG): container finished" podID="632237b7-0f5e-426d-ae9e-e434ac0e1da6" containerID="1536fe5e333d5a20e40588cb24ce1ca2f4e6fa6537e0eeb1d8f761c9bf0e62f8" exitCode=0 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.570647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"632237b7-0f5e-426d-ae9e-e434ac0e1da6","Type":"ContainerDied","Data":"1536fe5e333d5a20e40588cb24ce1ca2f4e6fa6537e0eeb1d8f761c9bf0e62f8"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.593029 4735 generic.go:334] "Generic (PLEG): container finished" podID="a39de41a-dde6-454c-b73e-12d8a0935746" containerID="60c72e8360603a750e0ac19c1c75fc66e664381449c7577e001b0a88aeae1fc3" exitCode=0 Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.593709 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" event={"ID":"a39de41a-dde6-454c-b73e-12d8a0935746","Type":"ContainerDied","Data":"60c72e8360603a750e0ac19c1c75fc66e664381449c7577e001b0a88aeae1fc3"} Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.603284 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " 
pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.608877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47blw\" (UniqueName: \"kubernetes.io/projected/b7541744-776e-4960-8eaf-039cc5489054-kube-api-access-47blw\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.609012 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-dns-svc\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.609092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-config\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.609191 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.604189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc 
kubenswrapper[4735]: I0317 01:27:59.609976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-dns-svc\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.610113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.611375 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-config\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.643161 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47blw\" (UniqueName: \"kubernetes.io/projected/b7541744-776e-4960-8eaf-039cc5489054-kube-api-access-47blw\") pod \"dnsmasq-dns-6c8c8d4885-pj7bf\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.781802 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.817250 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.914645 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-config\") pod \"a39de41a-dde6-454c-b73e-12d8a0935746\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.914995 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-nb\") pod \"a39de41a-dde6-454c-b73e-12d8a0935746\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.915094 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-dns-svc\") pod \"a39de41a-dde6-454c-b73e-12d8a0935746\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.915146 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-sb\") pod \"a39de41a-dde6-454c-b73e-12d8a0935746\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.915184 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5jd7\" (UniqueName: \"kubernetes.io/projected/a39de41a-dde6-454c-b73e-12d8a0935746-kube-api-access-d5jd7\") pod \"a39de41a-dde6-454c-b73e-12d8a0935746\" (UID: \"a39de41a-dde6-454c-b73e-12d8a0935746\") " Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.930048 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a39de41a-dde6-454c-b73e-12d8a0935746-kube-api-access-d5jd7" (OuterVolumeSpecName: "kube-api-access-d5jd7") pod "a39de41a-dde6-454c-b73e-12d8a0935746" (UID: "a39de41a-dde6-454c-b73e-12d8a0935746"). InnerVolumeSpecName "kube-api-access-d5jd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:27:59 crc kubenswrapper[4735]: I0317 01:27:59.977490 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a39de41a-dde6-454c-b73e-12d8a0935746" (UID: "a39de41a-dde6-454c-b73e-12d8a0935746"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.004535 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a39de41a-dde6-454c-b73e-12d8a0935746" (UID: "a39de41a-dde6-454c-b73e-12d8a0935746"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.012467 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-config" (OuterVolumeSpecName: "config") pod "a39de41a-dde6-454c-b73e-12d8a0935746" (UID: "a39de41a-dde6-454c-b73e-12d8a0935746"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.018252 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.018272 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.018282 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.018292 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5jd7\" (UniqueName: \"kubernetes.io/projected/a39de41a-dde6-454c-b73e-12d8a0935746-kube-api-access-d5jd7\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.088366 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a39de41a-dde6-454c-b73e-12d8a0935746" (UID: "a39de41a-dde6-454c-b73e-12d8a0935746"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.120232 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39de41a-dde6-454c-b73e-12d8a0935746-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.127225 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561848-4xvkv"] Mar 17 01:28:00 crc kubenswrapper[4735]: E0317 01:28:00.127503 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" containerName="dnsmasq-dns" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.127518 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" containerName="dnsmasq-dns" Mar 17 01:28:00 crc kubenswrapper[4735]: E0317 01:28:00.127534 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" containerName="init" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.127540 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" containerName="init" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.127672 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" containerName="dnsmasq-dns" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.130653 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.134850 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.135068 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.135248 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.136573 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-4xvkv"] Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.221736 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvr8\" (UniqueName: \"kubernetes.io/projected/9b0e19e4-fcba-4981-ad27-32c13a744be5-kube-api-access-qtvr8\") pod \"auto-csr-approver-29561848-4xvkv\" (UID: \"9b0e19e4-fcba-4981-ad27-32c13a744be5\") " pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.315382 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8c8d4885-pj7bf"] Mar 17 01:28:00 crc kubenswrapper[4735]: W0317 01:28:00.316143 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7541744_776e_4960_8eaf_039cc5489054.slice/crio-4f1ff03b3ee65d1d7cf197d27a3a10922994b066801b21bc6be5a7d234f0d89c WatchSource:0}: Error finding container 4f1ff03b3ee65d1d7cf197d27a3a10922994b066801b21bc6be5a7d234f0d89c: Status 404 returned error can't find the container with id 4f1ff03b3ee65d1d7cf197d27a3a10922994b066801b21bc6be5a7d234f0d89c Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.323388 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvr8\" (UniqueName: \"kubernetes.io/projected/9b0e19e4-fcba-4981-ad27-32c13a744be5-kube-api-access-qtvr8\") pod \"auto-csr-approver-29561848-4xvkv\" (UID: \"9b0e19e4-fcba-4981-ad27-32c13a744be5\") " pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.340762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvr8\" (UniqueName: \"kubernetes.io/projected/9b0e19e4-fcba-4981-ad27-32c13a744be5-kube-api-access-qtvr8\") pod \"auto-csr-approver-29561848-4xvkv\" (UID: \"9b0e19e4-fcba-4981-ad27-32c13a744be5\") " pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.455093 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.552543 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.558879 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.565326 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.565479 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xhkvh" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.565617 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.565744 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.582581 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.615026 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" event={"ID":"a39de41a-dde6-454c-b73e-12d8a0935746","Type":"ContainerDied","Data":"359b6ae1e6085a8dc593f99df856fac2082e2123e8792dac77e7564985dad7ac"} Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.615064 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd56bc579-2ttvq" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.615099 4735 scope.go:117] "RemoveContainer" containerID="60c72e8360603a750e0ac19c1c75fc66e664381449c7577e001b0a88aeae1fc3" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.618441 4735 generic.go:334] "Generic (PLEG): container finished" podID="b7541744-776e-4960-8eaf-039cc5489054" containerID="9fc0178fa3830d69abf382ab8624df041a2d63aad099493eb0e79ae3543559a1" exitCode=0 Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.618501 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" event={"ID":"b7541744-776e-4960-8eaf-039cc5489054","Type":"ContainerDied","Data":"9fc0178fa3830d69abf382ab8624df041a2d63aad099493eb0e79ae3543559a1"} Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.618519 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" event={"ID":"b7541744-776e-4960-8eaf-039cc5489054","Type":"ContainerStarted","Data":"4f1ff03b3ee65d1d7cf197d27a3a10922994b066801b21bc6be5a7d234f0d89c"} Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.627695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.627744 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630046af-37dd-4e3d-9109-c130caec8508-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.627772 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/630046af-37dd-4e3d-9109-c130caec8508-lock\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.627803 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdch\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-kube-api-access-wpdch\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.627840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.627888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/630046af-37dd-4e3d-9109-c130caec8508-cache\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.633773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"632237b7-0f5e-426d-ae9e-e434ac0e1da6","Type":"ContainerStarted","Data":"ca623e791e8d8b94617bfc89b7594353f4fda9511086786298c092563bb8d3ca"} Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.658017 4735 scope.go:117] "RemoveContainer" containerID="3634b58e9fd43637745f7f2d14bd8a7cdc4afdc20d4289058567f9b030ff320b" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.738280 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.738356 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/630046af-37dd-4e3d-9109-c130caec8508-cache\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.738395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.738440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630046af-37dd-4e3d-9109-c130caec8508-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.738483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/630046af-37dd-4e3d-9109-c130caec8508-lock\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.738534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdch\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-kube-api-access-wpdch\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " 
pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.739242 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.742208 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/630046af-37dd-4e3d-9109-c130caec8508-lock\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.744302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/630046af-37dd-4e3d-9109-c130caec8508-cache\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: E0317 01:28:00.744382 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 01:28:00 crc kubenswrapper[4735]: E0317 01:28:00.744394 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 01:28:00 crc kubenswrapper[4735]: E0317 01:28:00.744452 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift podName:630046af-37dd-4e3d-9109-c130caec8508 nodeName:}" failed. No retries permitted until 2026-03-17 01:28:01.244437046 +0000 UTC m=+1106.876670024 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift") pod "swift-storage-0" (UID: "630046af-37dd-4e3d-9109-c130caec8508") : configmap "swift-ring-files" not found Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.766359 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371990.088436 podStartE2EDuration="46.76634059s" podCreationTimestamp="2026-03-17 01:27:14 +0000 UTC" firstStartedPulling="2026-03-17 01:27:17.231472197 +0000 UTC m=+1062.863705175" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:28:00.689543547 +0000 UTC m=+1106.321776535" watchObservedRunningTime="2026-03-17 01:28:00.76634059 +0000 UTC m=+1106.398573568" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.770692 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630046af-37dd-4e3d-9109-c130caec8508-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.774008 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpdch\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-kube-api-access-wpdch\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.780273 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd56bc579-2ttvq"] Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.793469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: 
\"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:00 crc kubenswrapper[4735]: I0317 01:28:00.794175 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd56bc579-2ttvq"] Mar 17 01:28:00 crc kubenswrapper[4735]: E0317 01:28:00.968003 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39de41a_dde6_454c_b73e_12d8a0935746.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.037903 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-4xvkv"] Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.171536 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39de41a-dde6-454c-b73e-12d8a0935746" path="/var/lib/kubelet/pods/a39de41a-dde6-454c-b73e-12d8a0935746/volumes" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.221052 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.287047 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:01 crc kubenswrapper[4735]: E0317 01:28:01.288580 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 01:28:01 crc kubenswrapper[4735]: E0317 01:28:01.288609 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 01:28:01 crc kubenswrapper[4735]: E0317 01:28:01.288646 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift podName:630046af-37dd-4e3d-9109-c130caec8508 nodeName:}" failed. No retries permitted until 2026-03-17 01:28:02.288631199 +0000 UTC m=+1107.920864177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift") pod "swift-storage-0" (UID: "630046af-37dd-4e3d-9109-c130caec8508") : configmap "swift-ring-files" not found Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.387803 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3dfa49-bee3-46cf-982d-6b126b72a46b-operator-scripts\") pod \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.388166 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz9s4\" (UniqueName: \"kubernetes.io/projected/ee3dfa49-bee3-46cf-982d-6b126b72a46b-kube-api-access-zz9s4\") pod \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\" (UID: \"ee3dfa49-bee3-46cf-982d-6b126b72a46b\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.388586 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee3dfa49-bee3-46cf-982d-6b126b72a46b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee3dfa49-bee3-46cf-982d-6b126b72a46b" (UID: "ee3dfa49-bee3-46cf-982d-6b126b72a46b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.395892 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3dfa49-bee3-46cf-982d-6b126b72a46b-kube-api-access-zz9s4" (OuterVolumeSpecName: "kube-api-access-zz9s4") pod "ee3dfa49-bee3-46cf-982d-6b126b72a46b" (UID: "ee3dfa49-bee3-46cf-982d-6b126b72a46b"). InnerVolumeSpecName "kube-api-access-zz9s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.423120 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nw7cf" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.426008 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.457129 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v6brj" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.489341 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3dfa49-bee3-46cf-982d-6b126b72a46b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.489384 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz9s4\" (UniqueName: \"kubernetes.io/projected/ee3dfa49-bee3-46cf-982d-6b126b72a46b-kube-api-access-zz9s4\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.590739 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-operator-scripts\") pod \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.590799 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-operator-scripts\") pod \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.590842 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6678bd59-bc5f-4441-b5c7-f68abbf8f385-operator-scripts\") pod \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.590946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6zlv\" (UniqueName: \"kubernetes.io/projected/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-kube-api-access-d6zlv\") pod \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\" (UID: \"1e6b4495-838e-4cda-9d1f-7b087fe4ec50\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.591013 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xscgb\" (UniqueName: \"kubernetes.io/projected/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-kube-api-access-xscgb\") pod \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\" (UID: \"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.591090 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdvc\" (UniqueName: \"kubernetes.io/projected/6678bd59-bc5f-4441-b5c7-f68abbf8f385-kube-api-access-2cdvc\") pod \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\" (UID: \"6678bd59-bc5f-4441-b5c7-f68abbf8f385\") " Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.591275 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e6b4495-838e-4cda-9d1f-7b087fe4ec50" (UID: "1e6b4495-838e-4cda-9d1f-7b087fe4ec50"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.591735 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.592109 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" (UID: "c2b81ce1-9517-4b0f-bae1-fd5e586e98a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.592359 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6678bd59-bc5f-4441-b5c7-f68abbf8f385-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6678bd59-bc5f-4441-b5c7-f68abbf8f385" (UID: "6678bd59-bc5f-4441-b5c7-f68abbf8f385"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.594433 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-kube-api-access-xscgb" (OuterVolumeSpecName: "kube-api-access-xscgb") pod "c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" (UID: "c2b81ce1-9517-4b0f-bae1-fd5e586e98a9"). InnerVolumeSpecName "kube-api-access-xscgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.594809 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-kube-api-access-d6zlv" (OuterVolumeSpecName: "kube-api-access-d6zlv") pod "1e6b4495-838e-4cda-9d1f-7b087fe4ec50" (UID: "1e6b4495-838e-4cda-9d1f-7b087fe4ec50"). InnerVolumeSpecName "kube-api-access-d6zlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.595150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6678bd59-bc5f-4441-b5c7-f68abbf8f385-kube-api-access-2cdvc" (OuterVolumeSpecName: "kube-api-access-2cdvc") pod "6678bd59-bc5f-4441-b5c7-f68abbf8f385" (UID: "6678bd59-bc5f-4441-b5c7-f68abbf8f385"). InnerVolumeSpecName "kube-api-access-2cdvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.641973 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0c55-account-create-update-7msf2" event={"ID":"ee3dfa49-bee3-46cf-982d-6b126b72a46b","Type":"ContainerDied","Data":"050bc1c178b834cebcab2c51eaf3f2538131e11f1bcde6b6ee79ffe2ba87137a"} Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.642008 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="050bc1c178b834cebcab2c51eaf3f2538131e11f1bcde6b6ee79ffe2ba87137a" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.642017 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0c55-account-create-update-7msf2" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.644737 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-v6brj" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.644930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v6brj" event={"ID":"6678bd59-bc5f-4441-b5c7-f68abbf8f385","Type":"ContainerDied","Data":"bcac8410a17ca7d9730ca606019712c575620c27721a921ebe597643c33c7406"} Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.645026 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcac8410a17ca7d9730ca606019712c575620c27721a921ebe597643c33c7406" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.646721 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1db9-account-create-update-sqmm7" event={"ID":"c2b81ce1-9517-4b0f-bae1-fd5e586e98a9","Type":"ContainerDied","Data":"10e4c63ecbbfc1f2574eef151cbdd8a78bdba2706e0e650092c60b666c10d028"} Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.646916 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e4c63ecbbfc1f2574eef151cbdd8a78bdba2706e0e650092c60b666c10d028" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.646894 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1db9-account-create-update-sqmm7" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.654469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" event={"ID":"b7541744-776e-4960-8eaf-039cc5489054","Type":"ContainerStarted","Data":"eb842bb48868aa1b7162a7c93662eff6b6860d3187ab9c2594f0d1da0ab24dbb"} Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.654769 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.656102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nw7cf" event={"ID":"1e6b4495-838e-4cda-9d1f-7b087fe4ec50","Type":"ContainerDied","Data":"2c0f90e414388bd466319b05ba9f5705e1fdc65602252019c0a2abe0a86d104d"} Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.656132 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c0f90e414388bd466319b05ba9f5705e1fdc65602252019c0a2abe0a86d104d" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.656109 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-nw7cf" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.657289 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" event={"ID":"9b0e19e4-fcba-4981-ad27-32c13a744be5","Type":"ContainerStarted","Data":"2395274b17683fdc6e8417661c8e27f160f95b3dd16945dd6dcc4ed6cd187f1c"} Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.677508 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" podStartSLOduration=2.67748859 podStartE2EDuration="2.67748859s" podCreationTimestamp="2026-03-17 01:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:28:01.674686021 +0000 UTC m=+1107.306918999" watchObservedRunningTime="2026-03-17 01:28:01.67748859 +0000 UTC m=+1107.309721568" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.693889 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6zlv\" (UniqueName: \"kubernetes.io/projected/1e6b4495-838e-4cda-9d1f-7b087fe4ec50-kube-api-access-d6zlv\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.693919 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xscgb\" (UniqueName: \"kubernetes.io/projected/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-kube-api-access-xscgb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.693950 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdvc\" (UniqueName: \"kubernetes.io/projected/6678bd59-bc5f-4441-b5c7-f68abbf8f385-kube-api-access-2cdvc\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.693959 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:01 crc kubenswrapper[4735]: I0317 01:28:01.693970 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6678bd59-bc5f-4441-b5c7-f68abbf8f385-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.165525 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fjwfz"] Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.166696 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6678bd59-bc5f-4441-b5c7-f68abbf8f385" containerName="mariadb-database-create" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.166804 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6678bd59-bc5f-4441-b5c7-f68abbf8f385" containerName="mariadb-database-create" Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.166945 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6b4495-838e-4cda-9d1f-7b087fe4ec50" containerName="mariadb-database-create" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.167045 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6b4495-838e-4cda-9d1f-7b087fe4ec50" containerName="mariadb-database-create" Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.167165 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" containerName="mariadb-account-create-update" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.167269 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" containerName="mariadb-account-create-update" Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.167377 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3dfa49-bee3-46cf-982d-6b126b72a46b" 
containerName="mariadb-account-create-update" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.167592 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3dfa49-bee3-46cf-982d-6b126b72a46b" containerName="mariadb-account-create-update" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.167936 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6b4495-838e-4cda-9d1f-7b087fe4ec50" containerName="mariadb-database-create" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.168070 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3dfa49-bee3-46cf-982d-6b126b72a46b" containerName="mariadb-account-create-update" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.168188 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6678bd59-bc5f-4441-b5c7-f68abbf8f385" containerName="mariadb-database-create" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.168294 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" containerName="mariadb-account-create-update" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.169056 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.180118 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fjwfz"] Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.292205 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-af95-account-create-update-lvxl5"] Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.293832 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.297374 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.303618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrqb\" (UniqueName: \"kubernetes.io/projected/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-kube-api-access-kdrqb\") pod \"glance-db-create-fjwfz\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.303680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.303729 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-operator-scripts\") pod \"glance-db-create-fjwfz\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.303941 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.303959 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 01:28:02 crc kubenswrapper[4735]: E0317 01:28:02.303998 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift 
podName:630046af-37dd-4e3d-9109-c130caec8508 nodeName:}" failed. No retries permitted until 2026-03-17 01:28:04.303983988 +0000 UTC m=+1109.936216966 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift") pod "swift-storage-0" (UID: "630046af-37dd-4e3d-9109-c130caec8508") : configmap "swift-ring-files" not found Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.330046 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-af95-account-create-update-lvxl5"] Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.404901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-operator-scripts\") pod \"glance-db-create-fjwfz\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.404948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02b1480-3872-40f9-98fb-f04132fba4c4-operator-scripts\") pod \"glance-af95-account-create-update-lvxl5\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.405032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqj6\" (UniqueName: \"kubernetes.io/projected/d02b1480-3872-40f9-98fb-f04132fba4c4-kube-api-access-njqj6\") pod \"glance-af95-account-create-update-lvxl5\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.405054 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kdrqb\" (UniqueName: \"kubernetes.io/projected/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-kube-api-access-kdrqb\") pod \"glance-db-create-fjwfz\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.405610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-operator-scripts\") pod \"glance-db-create-fjwfz\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.425915 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrqb\" (UniqueName: \"kubernetes.io/projected/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-kube-api-access-kdrqb\") pod \"glance-db-create-fjwfz\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.495136 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.506537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqj6\" (UniqueName: \"kubernetes.io/projected/d02b1480-3872-40f9-98fb-f04132fba4c4-kube-api-access-njqj6\") pod \"glance-af95-account-create-update-lvxl5\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.506628 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02b1480-3872-40f9-98fb-f04132fba4c4-operator-scripts\") pod \"glance-af95-account-create-update-lvxl5\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.507236 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02b1480-3872-40f9-98fb-f04132fba4c4-operator-scripts\") pod \"glance-af95-account-create-update-lvxl5\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.544174 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqj6\" (UniqueName: \"kubernetes.io/projected/d02b1480-3872-40f9-98fb-f04132fba4c4-kube-api-access-njqj6\") pod \"glance-af95-account-create-update-lvxl5\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.625493 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.672980 4735 generic.go:334] "Generic (PLEG): container finished" podID="9b0e19e4-fcba-4981-ad27-32c13a744be5" containerID="25ff3155c58eacfa3b7a0810dcefee81ac46ae8c2e4337c4492a17e869a9c7a2" exitCode=0 Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.673268 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" event={"ID":"9b0e19e4-fcba-4981-ad27-32c13a744be5","Type":"ContainerDied","Data":"25ff3155c58eacfa3b7a0810dcefee81ac46ae8c2e4337c4492a17e869a9c7a2"} Mar 17 01:28:02 crc kubenswrapper[4735]: I0317 01:28:02.939065 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fjwfz"] Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.065048 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-af95-account-create-update-lvxl5"] Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.692028 4735 generic.go:334] "Generic (PLEG): container finished" podID="76ee9a4a-f88f-498f-b41d-d5dd25a00f41" containerID="6e6d50c8976a0d13bc7bc7f61efc7bcec9c0cea3cc50648f8545f814963fb1ca" exitCode=0 Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.692483 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fjwfz" event={"ID":"76ee9a4a-f88f-498f-b41d-d5dd25a00f41","Type":"ContainerDied","Data":"6e6d50c8976a0d13bc7bc7f61efc7bcec9c0cea3cc50648f8545f814963fb1ca"} Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.693790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fjwfz" event={"ID":"76ee9a4a-f88f-498f-b41d-d5dd25a00f41","Type":"ContainerStarted","Data":"766def7a9253298a646a1897950ed6610bc1823070522283a0323f804f0f09ee"} Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.696789 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="d02b1480-3872-40f9-98fb-f04132fba4c4" containerID="3708d5d739a7b0d7bfe2ee97b550886a59361062400c7e342597fa5baad6e278" exitCode=0 Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.696848 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-af95-account-create-update-lvxl5" event={"ID":"d02b1480-3872-40f9-98fb-f04132fba4c4","Type":"ContainerDied","Data":"3708d5d739a7b0d7bfe2ee97b550886a59361062400c7e342597fa5baad6e278"} Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.696899 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-af95-account-create-update-lvxl5" event={"ID":"d02b1480-3872-40f9-98fb-f04132fba4c4","Type":"ContainerStarted","Data":"68af368fe6cabb46e194746ffb6be1096260588c1ab626aaae2f86b2b7f23af3"} Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.860891 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8f2wd"] Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.867213 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.871479 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 17 01:28:03 crc kubenswrapper[4735]: I0317 01:28:03.873596 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8f2wd"] Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.028392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkdwb\" (UniqueName: \"kubernetes.io/projected/cdf03766-83e4-4899-b5df-ef8821dccd88-kube-api-access-qkdwb\") pod \"root-account-create-update-8f2wd\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.028698 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf03766-83e4-4899-b5df-ef8821dccd88-operator-scripts\") pod \"root-account-create-update-8f2wd\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.068789 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.130062 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkdwb\" (UniqueName: \"kubernetes.io/projected/cdf03766-83e4-4899-b5df-ef8821dccd88-kube-api-access-qkdwb\") pod \"root-account-create-update-8f2wd\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.130118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf03766-83e4-4899-b5df-ef8821dccd88-operator-scripts\") pod \"root-account-create-update-8f2wd\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.130825 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf03766-83e4-4899-b5df-ef8821dccd88-operator-scripts\") pod \"root-account-create-update-8f2wd\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.148116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkdwb\" (UniqueName: \"kubernetes.io/projected/cdf03766-83e4-4899-b5df-ef8821dccd88-kube-api-access-qkdwb\") pod \"root-account-create-update-8f2wd\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.183804 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.230944 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtvr8\" (UniqueName: \"kubernetes.io/projected/9b0e19e4-fcba-4981-ad27-32c13a744be5-kube-api-access-qtvr8\") pod \"9b0e19e4-fcba-4981-ad27-32c13a744be5\" (UID: \"9b0e19e4-fcba-4981-ad27-32c13a744be5\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.234622 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0e19e4-fcba-4981-ad27-32c13a744be5-kube-api-access-qtvr8" (OuterVolumeSpecName: "kube-api-access-qtvr8") pod "9b0e19e4-fcba-4981-ad27-32c13a744be5" (UID: "9b0e19e4-fcba-4981-ad27-32c13a744be5"). InnerVolumeSpecName "kube-api-access-qtvr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.333867 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:04 crc kubenswrapper[4735]: E0317 01:28:04.334141 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 01:28:04 crc kubenswrapper[4735]: E0317 01:28:04.334164 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 01:28:04 crc kubenswrapper[4735]: E0317 01:28:04.334255 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift podName:630046af-37dd-4e3d-9109-c130caec8508 nodeName:}" failed. 
No retries permitted until 2026-03-17 01:28:08.334235144 +0000 UTC m=+1113.966468132 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift") pod "swift-storage-0" (UID: "630046af-37dd-4e3d-9109-c130caec8508") : configmap "swift-ring-files" not found Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.334304 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtvr8\" (UniqueName: \"kubernetes.io/projected/9b0e19e4-fcba-4981-ad27-32c13a744be5-kube-api-access-qtvr8\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.441536 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8vgqg"] Mar 17 01:28:04 crc kubenswrapper[4735]: E0317 01:28:04.441850 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0e19e4-fcba-4981-ad27-32c13a744be5" containerName="oc" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.441878 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0e19e4-fcba-4981-ad27-32c13a744be5" containerName="oc" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.442042 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0e19e4-fcba-4981-ad27-32c13a744be5" containerName="oc" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.446089 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.448844 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.449045 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.449956 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.480114 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8vgqg"] Mar 17 01:28:04 crc kubenswrapper[4735]: E0317 01:28:04.480502 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-xw8gh ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-8vgqg" podUID="2b931f33-1975-4be5-bcaa-6b72ae7de486" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.490606 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j75nx"] Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.491658 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.505934 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8vgqg"] Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.517443 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j75nx"] Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.537478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-ring-data-devices\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.537528 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-scripts\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.537554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-combined-ca-bundle\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.537771 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b931f33-1975-4be5-bcaa-6b72ae7de486-etc-swift\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 
17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.538028 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-dispersionconf\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.538119 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8gh\" (UniqueName: \"kubernetes.io/projected/2b931f33-1975-4be5-bcaa-6b72ae7de486-kube-api-access-xw8gh\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.538154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-swiftconf\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.616238 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8f2wd"] Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-dispersionconf\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-scripts\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640272 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8gh\" (UniqueName: \"kubernetes.io/projected/2b931f33-1975-4be5-bcaa-6b72ae7de486-kube-api-access-xw8gh\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640294 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-swiftconf\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-swiftconf\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-ring-data-devices\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640387 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-ring-data-devices\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640419 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p46bp\" (UniqueName: \"kubernetes.io/projected/a2639420-3978-4b55-a81e-f1e770e09cf2-kube-api-access-p46bp\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-scripts\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-combined-ca-bundle\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640493 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2639420-3978-4b55-a81e-f1e770e09cf2-etc-swift\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-combined-ca-bundle\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b931f33-1975-4be5-bcaa-6b72ae7de486-etc-swift\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.640580 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-dispersionconf\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.641902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-ring-data-devices\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.642780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-scripts\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.643016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b931f33-1975-4be5-bcaa-6b72ae7de486-etc-swift\") pod 
\"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.646392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-combined-ca-bundle\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.646442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-swiftconf\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.647792 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-dispersionconf\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.661001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8gh\" (UniqueName: \"kubernetes.io/projected/2b931f33-1975-4be5-bcaa-6b72ae7de486-kube-api-access-xw8gh\") pod \"swift-ring-rebalance-8vgqg\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.704836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" event={"ID":"9b0e19e4-fcba-4981-ad27-32c13a744be5","Type":"ContainerDied","Data":"2395274b17683fdc6e8417661c8e27f160f95b3dd16945dd6dcc4ed6cd187f1c"} Mar 17 01:28:04 crc kubenswrapper[4735]: 
I0317 01:28:04.705127 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2395274b17683fdc6e8417661c8e27f160f95b3dd16945dd6dcc4ed6cd187f1c" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.704883 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-4xvkv" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.712747 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.712976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8f2wd" event={"ID":"cdf03766-83e4-4899-b5df-ef8821dccd88","Type":"ContainerStarted","Data":"07aff1ea76c75dad5a9248ed2161a8275e376b90cd26899ab89b242caa00849c"} Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.721968 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.741960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-scripts\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.742009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-swiftconf\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.742054 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-ring-data-devices\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.742073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p46bp\" (UniqueName: \"kubernetes.io/projected/a2639420-3978-4b55-a81e-f1e770e09cf2-kube-api-access-p46bp\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.742102 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2639420-3978-4b55-a81e-f1e770e09cf2-etc-swift\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.742120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-combined-ca-bundle\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.742152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-dispersionconf\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.743298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-ring-data-devices\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.743724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-scripts\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.745695 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2639420-3978-4b55-a81e-f1e770e09cf2-etc-swift\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.747146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-combined-ca-bundle\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.749418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-swiftconf\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.749530 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-dispersionconf\") pod \"swift-ring-rebalance-j75nx\" (UID: 
\"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.767301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p46bp\" (UniqueName: \"kubernetes.io/projected/a2639420-3978-4b55-a81e-f1e770e09cf2-kube-api-access-p46bp\") pod \"swift-ring-rebalance-j75nx\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.810504 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-swiftconf\") pod \"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845404 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b931f33-1975-4be5-bcaa-6b72ae7de486-etc-swift\") pod \"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845428 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw8gh\" (UniqueName: \"kubernetes.io/projected/2b931f33-1975-4be5-bcaa-6b72ae7de486-kube-api-access-xw8gh\") pod \"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845474 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-scripts\") pod 
\"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845510 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-ring-data-devices\") pod \"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845596 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-combined-ca-bundle\") pod \"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.845615 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-dispersionconf\") pod \"2b931f33-1975-4be5-bcaa-6b72ae7de486\" (UID: \"2b931f33-1975-4be5-bcaa-6b72ae7de486\") " Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.846397 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.846670 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-scripts" (OuterVolumeSpecName: "scripts") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.846799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b931f33-1975-4be5-bcaa-6b72ae7de486-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.850335 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.852551 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.853366 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b931f33-1975-4be5-bcaa-6b72ae7de486-kube-api-access-xw8gh" (OuterVolumeSpecName: "kube-api-access-xw8gh") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "kube-api-access-xw8gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.854933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2b931f33-1975-4be5-bcaa-6b72ae7de486" (UID: "2b931f33-1975-4be5-bcaa-6b72ae7de486"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948405 4735 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948435 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b931f33-1975-4be5-bcaa-6b72ae7de486-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948445 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw8gh\" (UniqueName: \"kubernetes.io/projected/2b931f33-1975-4be5-bcaa-6b72ae7de486-kube-api-access-xw8gh\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948457 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948466 4735 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b931f33-1975-4be5-bcaa-6b72ae7de486-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948478 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.948486 4735 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b931f33-1975-4be5-bcaa-6b72ae7de486-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:04 crc kubenswrapper[4735]: I0317 01:28:04.986575 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.132066 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-kckdb"] Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.134944 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-kckdb"] Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.152418 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdrqb\" (UniqueName: \"kubernetes.io/projected/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-kube-api-access-kdrqb\") pod \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.152509 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-operator-scripts\") pod \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\" (UID: \"76ee9a4a-f88f-498f-b41d-d5dd25a00f41\") " Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.153248 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76ee9a4a-f88f-498f-b41d-d5dd25a00f41" (UID: "76ee9a4a-f88f-498f-b41d-d5dd25a00f41"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.156591 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-kube-api-access-kdrqb" (OuterVolumeSpecName: "kube-api-access-kdrqb") pod "76ee9a4a-f88f-498f-b41d-d5dd25a00f41" (UID: "76ee9a4a-f88f-498f-b41d-d5dd25a00f41"). InnerVolumeSpecName "kube-api-access-kdrqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.213703 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.254364 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.254400 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdrqb\" (UniqueName: \"kubernetes.io/projected/76ee9a4a-f88f-498f-b41d-d5dd25a00f41-kube-api-access-kdrqb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.356199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02b1480-3872-40f9-98fb-f04132fba4c4-operator-scripts\") pod \"d02b1480-3872-40f9-98fb-f04132fba4c4\" (UID: \"d02b1480-3872-40f9-98fb-f04132fba4c4\") " Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.356270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njqj6\" (UniqueName: \"kubernetes.io/projected/d02b1480-3872-40f9-98fb-f04132fba4c4-kube-api-access-njqj6\") pod \"d02b1480-3872-40f9-98fb-f04132fba4c4\" (UID: 
\"d02b1480-3872-40f9-98fb-f04132fba4c4\") " Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.356679 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02b1480-3872-40f9-98fb-f04132fba4c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d02b1480-3872-40f9-98fb-f04132fba4c4" (UID: "d02b1480-3872-40f9-98fb-f04132fba4c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.373955 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02b1480-3872-40f9-98fb-f04132fba4c4-kube-api-access-njqj6" (OuterVolumeSpecName: "kube-api-access-njqj6") pod "d02b1480-3872-40f9-98fb-f04132fba4c4" (UID: "d02b1480-3872-40f9-98fb-f04132fba4c4"). InnerVolumeSpecName "kube-api-access-njqj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.405891 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j75nx"] Mar 17 01:28:05 crc kubenswrapper[4735]: W0317 01:28:05.413916 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2639420_3978_4b55_a81e_f1e770e09cf2.slice/crio-5c7c6161cce47fd2ccf64f28f96a0b67b7418a2496a29cf989c4888a6ad5a22f WatchSource:0}: Error finding container 5c7c6161cce47fd2ccf64f28f96a0b67b7418a2496a29cf989c4888a6ad5a22f: Status 404 returned error can't find the container with id 5c7c6161cce47fd2ccf64f28f96a0b67b7418a2496a29cf989c4888a6ad5a22f Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.458107 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02b1480-3872-40f9-98fb-f04132fba4c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.458160 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njqj6\" (UniqueName: \"kubernetes.io/projected/d02b1480-3872-40f9-98fb-f04132fba4c4-kube-api-access-njqj6\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.722778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-af95-account-create-update-lvxl5" event={"ID":"d02b1480-3872-40f9-98fb-f04132fba4c4","Type":"ContainerDied","Data":"68af368fe6cabb46e194746ffb6be1096260588c1ab626aaae2f86b2b7f23af3"} Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.722837 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68af368fe6cabb46e194746ffb6be1096260588c1ab626aaae2f86b2b7f23af3" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.722941 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-af95-account-create-update-lvxl5" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.729623 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fjwfz" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.729628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fjwfz" event={"ID":"76ee9a4a-f88f-498f-b41d-d5dd25a00f41","Type":"ContainerDied","Data":"766def7a9253298a646a1897950ed6610bc1823070522283a0323f804f0f09ee"} Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.729759 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="766def7a9253298a646a1897950ed6610bc1823070522283a0323f804f0f09ee" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.731283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j75nx" event={"ID":"a2639420-3978-4b55-a81e-f1e770e09cf2","Type":"ContainerStarted","Data":"5c7c6161cce47fd2ccf64f28f96a0b67b7418a2496a29cf989c4888a6ad5a22f"} Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.734682 4735 generic.go:334] "Generic (PLEG): container finished" podID="cdf03766-83e4-4899-b5df-ef8821dccd88" containerID="7e5dafcede5736a2446e1649ad54298b053c86bcbf9d9fddcdec87aaf10e00e2" exitCode=0 Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.734747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8f2wd" event={"ID":"cdf03766-83e4-4899-b5df-ef8821dccd88","Type":"ContainerDied","Data":"7e5dafcede5736a2446e1649ad54298b053c86bcbf9d9fddcdec87aaf10e00e2"} Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.734758 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8vgqg" Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.808242 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8vgqg"] Mar 17 01:28:05 crc kubenswrapper[4735]: I0317 01:28:05.814404 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-8vgqg"] Mar 17 01:28:06 crc kubenswrapper[4735]: I0317 01:28:06.274480 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 17 01:28:06 crc kubenswrapper[4735]: I0317 01:28:06.274966 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.347143 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2266c1f2-ce32-49ff-83aa-174c9ce402c2" path="/var/lib/kubelet/pods/2266c1f2-ce32-49ff-83aa-174c9ce402c2/volumes" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.350145 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b931f33-1975-4be5-bcaa-6b72ae7de486" path="/var/lib/kubelet/pods/2b931f33-1975-4be5-bcaa-6b72ae7de486/volumes" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.551306 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vsj4g"] Mar 17 01:28:07 crc kubenswrapper[4735]: E0317 01:28:07.551658 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b1480-3872-40f9-98fb-f04132fba4c4" containerName="mariadb-account-create-update" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.551677 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b1480-3872-40f9-98fb-f04132fba4c4" containerName="mariadb-account-create-update" Mar 17 01:28:07 crc kubenswrapper[4735]: E0317 01:28:07.551698 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ee9a4a-f88f-498f-b41d-d5dd25a00f41" 
containerName="mariadb-database-create" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.551704 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ee9a4a-f88f-498f-b41d-d5dd25a00f41" containerName="mariadb-database-create" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.551876 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02b1480-3872-40f9-98fb-f04132fba4c4" containerName="mariadb-account-create-update" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.551898 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ee9a4a-f88f-498f-b41d-d5dd25a00f41" containerName="mariadb-database-create" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.552366 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.554913 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.555180 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-db8vv" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.563377 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vsj4g"] Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.624227 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.687312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmscb\" (UniqueName: \"kubernetes.io/projected/70c0c683-4152-4360-a49f-22a72c25ad1c-kube-api-access-jmscb\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.687379 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-config-data\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.687410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-db-sync-config-data\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.687488 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-combined-ca-bundle\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.750457 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.788904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-combined-ca-bundle\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.788986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmscb\" (UniqueName: \"kubernetes.io/projected/70c0c683-4152-4360-a49f-22a72c25ad1c-kube-api-access-jmscb\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.789068 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-config-data\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.789110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-db-sync-config-data\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.796778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-combined-ca-bundle\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 
01:28:07.797345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-config-data\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.798449 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-db-sync-config-data\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.807085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmscb\" (UniqueName: \"kubernetes.io/projected/70c0c683-4152-4360-a49f-22a72c25ad1c-kube-api-access-jmscb\") pod \"glance-db-sync-vsj4g\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.873391 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.890220 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf03766-83e4-4899-b5df-ef8821dccd88-operator-scripts\") pod \"cdf03766-83e4-4899-b5df-ef8821dccd88\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.890298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkdwb\" (UniqueName: \"kubernetes.io/projected/cdf03766-83e4-4899-b5df-ef8821dccd88-kube-api-access-qkdwb\") pod \"cdf03766-83e4-4899-b5df-ef8821dccd88\" (UID: \"cdf03766-83e4-4899-b5df-ef8821dccd88\") " Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.891165 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf03766-83e4-4899-b5df-ef8821dccd88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdf03766-83e4-4899-b5df-ef8821dccd88" (UID: "cdf03766-83e4-4899-b5df-ef8821dccd88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.893339 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf03766-83e4-4899-b5df-ef8821dccd88-kube-api-access-qkdwb" (OuterVolumeSpecName: "kube-api-access-qkdwb") pod "cdf03766-83e4-4899-b5df-ef8821dccd88" (UID: "cdf03766-83e4-4899-b5df-ef8821dccd88"). InnerVolumeSpecName "kube-api-access-qkdwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.992583 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkdwb\" (UniqueName: \"kubernetes.io/projected/cdf03766-83e4-4899-b5df-ef8821dccd88-kube-api-access-qkdwb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:07 crc kubenswrapper[4735]: I0317 01:28:07.992838 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf03766-83e4-4899-b5df-ef8821dccd88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:08 crc kubenswrapper[4735]: I0317 01:28:08.339443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8f2wd" event={"ID":"cdf03766-83e4-4899-b5df-ef8821dccd88","Type":"ContainerDied","Data":"07aff1ea76c75dad5a9248ed2161a8275e376b90cd26899ab89b242caa00849c"} Mar 17 01:28:08 crc kubenswrapper[4735]: I0317 01:28:08.339495 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07aff1ea76c75dad5a9248ed2161a8275e376b90cd26899ab89b242caa00849c" Mar 17 01:28:08 crc kubenswrapper[4735]: I0317 01:28:08.339547 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8f2wd" Mar 17 01:28:08 crc kubenswrapper[4735]: I0317 01:28:08.398769 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:08 crc kubenswrapper[4735]: E0317 01:28:08.399044 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 01:28:08 crc kubenswrapper[4735]: E0317 01:28:08.399057 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 01:28:08 crc kubenswrapper[4735]: E0317 01:28:08.399102 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift podName:630046af-37dd-4e3d-9109-c130caec8508 nodeName:}" failed. No retries permitted until 2026-03-17 01:28:16.399086545 +0000 UTC m=+1122.031319523 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift") pod "swift-storage-0" (UID: "630046af-37dd-4e3d-9109-c130caec8508") : configmap "swift-ring-files" not found Mar 17 01:28:08 crc kubenswrapper[4735]: I0317 01:28:08.402645 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vsj4g"] Mar 17 01:28:08 crc kubenswrapper[4735]: I0317 01:28:08.427114 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 17 01:28:09 crc kubenswrapper[4735]: I0317 01:28:09.356150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vsj4g" event={"ID":"70c0c683-4152-4360-a49f-22a72c25ad1c","Type":"ContainerStarted","Data":"48661a93debc0dfe72e8ac7ff59f02b4e6ed3e7122cb4ed48103b40ebb333f17"} Mar 17 01:28:09 crc kubenswrapper[4735]: I0317 01:28:09.409709 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 17 01:28:09 crc kubenswrapper[4735]: I0317 01:28:09.784544 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:28:09 crc kubenswrapper[4735]: I0317 01:28:09.888144 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dcf85566c-m8vnj"] Mar 17 01:28:09 crc kubenswrapper[4735]: I0317 01:28:09.888362 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerName="dnsmasq-dns" containerID="cri-o://33309d1c84731b36890aa55a86202a34401674495aff6484a5466bc6379ddce9" gracePeriod=10 Mar 17 01:28:10 crc kubenswrapper[4735]: I0317 01:28:10.366111 4735 generic.go:334] "Generic (PLEG): container finished" podID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" 
containerID="33309d1c84731b36890aa55a86202a34401674495aff6484a5466bc6379ddce9" exitCode=0 Mar 17 01:28:10 crc kubenswrapper[4735]: I0317 01:28:10.366172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" event={"ID":"f0ff000b-fe0a-4dfc-b976-92c57cf5595a","Type":"ContainerDied","Data":"33309d1c84731b36890aa55a86202a34401674495aff6484a5466bc6379ddce9"} Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.046397 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.221341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-dns-svc\") pod \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.221419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-config\") pod \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.221629 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cdhk\" (UniqueName: \"kubernetes.io/projected/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-kube-api-access-8cdhk\") pod \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\" (UID: \"f0ff000b-fe0a-4dfc-b976-92c57cf5595a\") " Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.227341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-kube-api-access-8cdhk" (OuterVolumeSpecName: "kube-api-access-8cdhk") pod "f0ff000b-fe0a-4dfc-b976-92c57cf5595a" (UID: "f0ff000b-fe0a-4dfc-b976-92c57cf5595a"). 
InnerVolumeSpecName "kube-api-access-8cdhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.258133 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0ff000b-fe0a-4dfc-b976-92c57cf5595a" (UID: "f0ff000b-fe0a-4dfc-b976-92c57cf5595a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.267236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-config" (OuterVolumeSpecName: "config") pod "f0ff000b-fe0a-4dfc-b976-92c57cf5595a" (UID: "f0ff000b-fe0a-4dfc-b976-92c57cf5595a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.323652 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.323907 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cdhk\" (UniqueName: \"kubernetes.io/projected/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-kube-api-access-8cdhk\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.324008 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ff000b-fe0a-4dfc-b976-92c57cf5595a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.401338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" 
event={"ID":"f0ff000b-fe0a-4dfc-b976-92c57cf5595a","Type":"ContainerDied","Data":"52c552fb04d515ae54ae648570a82c16f1c7afe8d60adbfc3e0f1cb65ff05c1d"} Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.401640 4735 scope.go:117] "RemoveContainer" containerID="33309d1c84731b36890aa55a86202a34401674495aff6484a5466bc6379ddce9" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.401922 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.410559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j75nx" event={"ID":"a2639420-3978-4b55-a81e-f1e770e09cf2","Type":"ContainerStarted","Data":"a2d79010f1a0a2a923577ba063a298972b0404b7ab569b78d03840086122a191"} Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.456395 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j75nx" podStartSLOduration=1.94685992 podStartE2EDuration="10.456371s" podCreationTimestamp="2026-03-17 01:28:04 +0000 UTC" firstStartedPulling="2026-03-17 01:28:05.415944378 +0000 UTC m=+1111.048177366" lastFinishedPulling="2026-03-17 01:28:13.925455478 +0000 UTC m=+1119.557688446" observedRunningTime="2026-03-17 01:28:14.431404184 +0000 UTC m=+1120.063637182" watchObservedRunningTime="2026-03-17 01:28:14.456371 +0000 UTC m=+1120.088603978" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.459528 4735 scope.go:117] "RemoveContainer" containerID="d3a33f154c8b57b81758cc751cc78e3f8d0ce52df0d78d8d12312cd69776a736" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.469041 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dcf85566c-m8vnj"] Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.486580 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dcf85566c-m8vnj"] Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 
01:28:14.510972 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.804910 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8f2wd"] Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.809919 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8f2wd"] Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.903714 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2c69q"] Mar 17 01:28:14 crc kubenswrapper[4735]: E0317 01:28:14.904199 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerName="dnsmasq-dns" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.904220 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerName="dnsmasq-dns" Mar 17 01:28:14 crc kubenswrapper[4735]: E0317 01:28:14.904244 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf03766-83e4-4899-b5df-ef8821dccd88" containerName="mariadb-account-create-update" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.904254 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf03766-83e4-4899-b5df-ef8821dccd88" containerName="mariadb-account-create-update" Mar 17 01:28:14 crc kubenswrapper[4735]: E0317 01:28:14.904276 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerName="init" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.904285 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerName="init" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.904470 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" 
containerName="dnsmasq-dns" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.904492 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf03766-83e4-4899-b5df-ef8821dccd88" containerName="mariadb-account-create-update" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.905172 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.910397 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 17 01:28:14 crc kubenswrapper[4735]: I0317 01:28:14.913435 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2c69q"] Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.047574 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-operator-scripts\") pod \"root-account-create-update-2c69q\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.047681 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crl6t\" (UniqueName: \"kubernetes.io/projected/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-kube-api-access-crl6t\") pod \"root-account-create-update-2c69q\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.085011 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf03766-83e4-4899-b5df-ef8821dccd88" path="/var/lib/kubelet/pods/cdf03766-83e4-4899-b5df-ef8821dccd88/volumes" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.085663 4735 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" path="/var/lib/kubelet/pods/f0ff000b-fe0a-4dfc-b976-92c57cf5595a/volumes" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.149375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-operator-scripts\") pod \"root-account-create-update-2c69q\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.149513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crl6t\" (UniqueName: \"kubernetes.io/projected/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-kube-api-access-crl6t\") pod \"root-account-create-update-2c69q\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.150507 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-operator-scripts\") pod \"root-account-create-update-2c69q\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.165778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crl6t\" (UniqueName: \"kubernetes.io/projected/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-kube-api-access-crl6t\") pod \"root-account-create-update-2c69q\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.236478 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:15 crc kubenswrapper[4735]: I0317 01:28:15.707364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2c69q"] Mar 17 01:28:15 crc kubenswrapper[4735]: W0317 01:28:15.718210 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e4061ea_70e0_4c6d_9b6b_8d4598ce0aef.slice/crio-bf901e306c97f83a01e30f641bc001dbf0601fc5abc2a4df9eba125911fbde44 WatchSource:0}: Error finding container bf901e306c97f83a01e30f641bc001dbf0601fc5abc2a4df9eba125911fbde44: Status 404 returned error can't find the container with id bf901e306c97f83a01e30f641bc001dbf0601fc5abc2a4df9eba125911fbde44 Mar 17 01:28:16 crc kubenswrapper[4735]: I0317 01:28:16.455178 4735 generic.go:334] "Generic (PLEG): container finished" podID="3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" containerID="b7539710c9b3565c378067c0fd583547bb5bacd3f09bbe9b6ea86658c5361ecd" exitCode=0 Mar 17 01:28:16 crc kubenswrapper[4735]: I0317 01:28:16.455223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2c69q" event={"ID":"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef","Type":"ContainerDied","Data":"b7539710c9b3565c378067c0fd583547bb5bacd3f09bbe9b6ea86658c5361ecd"} Mar 17 01:28:16 crc kubenswrapper[4735]: I0317 01:28:16.455246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2c69q" event={"ID":"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef","Type":"ContainerStarted","Data":"bf901e306c97f83a01e30f641bc001dbf0601fc5abc2a4df9eba125911fbde44"} Mar 17 01:28:16 crc kubenswrapper[4735]: I0317 01:28:16.497012 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " 
pod="openstack/swift-storage-0" Mar 17 01:28:16 crc kubenswrapper[4735]: E0317 01:28:16.497142 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 01:28:16 crc kubenswrapper[4735]: E0317 01:28:16.497171 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 01:28:16 crc kubenswrapper[4735]: E0317 01:28:16.497229 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift podName:630046af-37dd-4e3d-9109-c130caec8508 nodeName:}" failed. No retries permitted until 2026-03-17 01:28:32.497211638 +0000 UTC m=+1138.129444616 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift") pod "swift-storage-0" (UID: "630046af-37dd-4e3d-9109-c130caec8508") : configmap "swift-ring-files" not found Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.369718 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9ngz2" podUID="8d35e229-2eeb-4843-a9a2-763156affef5" containerName="ovn-controller" probeResult="failure" output=< Mar 17 01:28:17 crc kubenswrapper[4735]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 17 01:28:17 crc kubenswrapper[4735]: > Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.378823 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.384412 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ffshg" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.635940 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-9ngz2-config-t4bzp"] Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.636973 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.640946 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.650877 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9ngz2-config-t4bzp"] Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.704876 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-dcf85566c-m8vnj" podUID="f0ff000b-fe0a-4dfc-b976-92c57cf5595a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: i/o timeout" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.794159 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.818975 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.819011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-log-ovn\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.819034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-additional-scripts\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.819103 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-scripts\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.819190 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcld2\" (UniqueName: 
\"kubernetes.io/projected/83c86aca-f8f9-41b4-9247-c112a39231f0-kube-api-access-bcld2\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.819219 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run-ovn\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.920536 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-operator-scripts\") pod \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.920655 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crl6t\" (UniqueName: \"kubernetes.io/projected/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-kube-api-access-crl6t\") pod \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\" (UID: \"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef\") " Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.920960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcld2\" (UniqueName: \"kubernetes.io/projected/83c86aca-f8f9-41b4-9247-c112a39231f0-kube-api-access-bcld2\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run-ovn\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-log-ovn\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-additional-scripts\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-scripts\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" (UID: "3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.921381 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run-ovn\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.922146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-log-ovn\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.922221 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.922968 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-additional-scripts\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.923200 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-scripts\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.928393 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-kube-api-access-crl6t" (OuterVolumeSpecName: "kube-api-access-crl6t") pod "3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" (UID: "3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef"). InnerVolumeSpecName "kube-api-access-crl6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.938144 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcld2\" (UniqueName: \"kubernetes.io/projected/83c86aca-f8f9-41b4-9247-c112a39231f0-kube-api-access-bcld2\") pod \"ovn-controller-9ngz2-config-t4bzp\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:17 crc kubenswrapper[4735]: I0317 01:28:17.971098 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:18 crc kubenswrapper[4735]: I0317 01:28:18.022634 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:18 crc kubenswrapper[4735]: I0317 01:28:18.022950 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crl6t\" (UniqueName: \"kubernetes.io/projected/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef-kube-api-access-crl6t\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:18 crc kubenswrapper[4735]: I0317 01:28:18.417086 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9ngz2-config-t4bzp"] Mar 17 01:28:18 crc kubenswrapper[4735]: W0317 01:28:18.424073 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83c86aca_f8f9_41b4_9247_c112a39231f0.slice/crio-3bbd3c6dc875c79d0a010dec25596918d14a90171a987464d4fb0d7f996870b1 WatchSource:0}: Error finding container 3bbd3c6dc875c79d0a010dec25596918d14a90171a987464d4fb0d7f996870b1: Status 404 returned error can't find the container with id 3bbd3c6dc875c79d0a010dec25596918d14a90171a987464d4fb0d7f996870b1 Mar 17 01:28:18 crc kubenswrapper[4735]: I0317 01:28:18.470768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9ngz2-config-t4bzp" event={"ID":"83c86aca-f8f9-41b4-9247-c112a39231f0","Type":"ContainerStarted","Data":"3bbd3c6dc875c79d0a010dec25596918d14a90171a987464d4fb0d7f996870b1"} Mar 17 01:28:18 crc kubenswrapper[4735]: I0317 01:28:18.472206 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2c69q" event={"ID":"3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef","Type":"ContainerDied","Data":"bf901e306c97f83a01e30f641bc001dbf0601fc5abc2a4df9eba125911fbde44"} Mar 17 01:28:18 crc 
kubenswrapper[4735]: I0317 01:28:18.472352 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf901e306c97f83a01e30f641bc001dbf0601fc5abc2a4df9eba125911fbde44" Mar 17 01:28:18 crc kubenswrapper[4735]: I0317 01:28:18.472245 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2c69q" Mar 17 01:28:19 crc kubenswrapper[4735]: I0317 01:28:19.481428 4735 generic.go:334] "Generic (PLEG): container finished" podID="83c86aca-f8f9-41b4-9247-c112a39231f0" containerID="97bb2d235347544a900063a9c3ad48b344862bc6348b84beba23bd1304ca3278" exitCode=0 Mar 17 01:28:19 crc kubenswrapper[4735]: I0317 01:28:19.481482 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9ngz2-config-t4bzp" event={"ID":"83c86aca-f8f9-41b4-9247-c112a39231f0","Type":"ContainerDied","Data":"97bb2d235347544a900063a9c3ad48b344862bc6348b84beba23bd1304ca3278"} Mar 17 01:28:21 crc kubenswrapper[4735]: I0317 01:28:21.495430 4735 generic.go:334] "Generic (PLEG): container finished" podID="a2639420-3978-4b55-a81e-f1e770e09cf2" containerID="a2d79010f1a0a2a923577ba063a298972b0404b7ab569b78d03840086122a191" exitCode=0 Mar 17 01:28:21 crc kubenswrapper[4735]: I0317 01:28:21.495522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j75nx" event={"ID":"a2639420-3978-4b55-a81e-f1e770e09cf2","Type":"ContainerDied","Data":"a2d79010f1a0a2a923577ba063a298972b0404b7ab569b78d03840086122a191"} Mar 17 01:28:22 crc kubenswrapper[4735]: I0317 01:28:22.375464 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9ngz2" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.215811 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.224646 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-dispersionconf\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320637 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcld2\" (UniqueName: \"kubernetes.io/projected/83c86aca-f8f9-41b4-9247-c112a39231f0-kube-api-access-bcld2\") pod \"83c86aca-f8f9-41b4-9247-c112a39231f0\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320691 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-additional-scripts\") pod \"83c86aca-f8f9-41b4-9247-c112a39231f0\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320719 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-ring-data-devices\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-scripts\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: 
\"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-scripts\") pod \"83c86aca-f8f9-41b4-9247-c112a39231f0\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320787 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2639420-3978-4b55-a81e-f1e770e09cf2-etc-swift\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.320822 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-log-ovn\") pod \"83c86aca-f8f9-41b4-9247-c112a39231f0\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.321549 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "83c86aca-f8f9-41b4-9247-c112a39231f0" (UID: "83c86aca-f8f9-41b4-9247-c112a39231f0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.321986 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "83c86aca-f8f9-41b4-9247-c112a39231f0" (UID: "83c86aca-f8f9-41b4-9247-c112a39231f0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322297 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-scripts" (OuterVolumeSpecName: "scripts") pod "83c86aca-f8f9-41b4-9247-c112a39231f0" (UID: "83c86aca-f8f9-41b4-9247-c112a39231f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322505 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2639420-3978-4b55-a81e-f1e770e09cf2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322675 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run-ovn\") pod \"83c86aca-f8f9-41b4-9247-c112a39231f0\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run\") pod \"83c86aca-f8f9-41b4-9247-c112a39231f0\" (UID: \"83c86aca-f8f9-41b4-9247-c112a39231f0\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322769 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p46bp\" (UniqueName: \"kubernetes.io/projected/a2639420-3978-4b55-a81e-f1e770e09cf2-kube-api-access-p46bp\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322795 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-combined-ca-bundle\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322816 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-swiftconf\") pod \"a2639420-3978-4b55-a81e-f1e770e09cf2\" (UID: \"a2639420-3978-4b55-a81e-f1e770e09cf2\") " Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322938 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run" (OuterVolumeSpecName: "var-run") pod "83c86aca-f8f9-41b4-9247-c112a39231f0" (UID: "83c86aca-f8f9-41b4-9247-c112a39231f0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.322961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "83c86aca-f8f9-41b4-9247-c112a39231f0" (UID: "83c86aca-f8f9-41b4-9247-c112a39231f0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323501 4735 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323517 4735 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323526 4735 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323535 4735 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323544 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/83c86aca-f8f9-41b4-9247-c112a39231f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323551 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a2639420-3978-4b55-a81e-f1e770e09cf2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.323558 4735 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c86aca-f8f9-41b4-9247-c112a39231f0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.331549 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2639420-3978-4b55-a81e-f1e770e09cf2-kube-api-access-p46bp" (OuterVolumeSpecName: "kube-api-access-p46bp") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "kube-api-access-p46bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.333240 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.333917 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c86aca-f8f9-41b4-9247-c112a39231f0-kube-api-access-bcld2" (OuterVolumeSpecName: "kube-api-access-bcld2") pod "83c86aca-f8f9-41b4-9247-c112a39231f0" (UID: "83c86aca-f8f9-41b4-9247-c112a39231f0"). InnerVolumeSpecName "kube-api-access-bcld2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.354116 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-scripts" (OuterVolumeSpecName: "scripts") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.359506 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.371971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2639420-3978-4b55-a81e-f1e770e09cf2" (UID: "a2639420-3978-4b55-a81e-f1e770e09cf2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.424460 4735 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.424492 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcld2\" (UniqueName: \"kubernetes.io/projected/83c86aca-f8f9-41b4-9247-c112a39231f0-kube-api-access-bcld2\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.424505 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2639420-3978-4b55-a81e-f1e770e09cf2-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.424513 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p46bp\" (UniqueName: \"kubernetes.io/projected/a2639420-3978-4b55-a81e-f1e770e09cf2-kube-api-access-p46bp\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.424522 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.424529 4735 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a2639420-3978-4b55-a81e-f1e770e09cf2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.562294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9ngz2-config-t4bzp" event={"ID":"83c86aca-f8f9-41b4-9247-c112a39231f0","Type":"ContainerDied","Data":"3bbd3c6dc875c79d0a010dec25596918d14a90171a987464d4fb0d7f996870b1"} Mar 17 01:28:26 
crc kubenswrapper[4735]: I0317 01:28:26.562351 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bbd3c6dc875c79d0a010dec25596918d14a90171a987464d4fb0d7f996870b1" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.562451 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9ngz2-config-t4bzp" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.569431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j75nx" event={"ID":"a2639420-3978-4b55-a81e-f1e770e09cf2","Type":"ContainerDied","Data":"5c7c6161cce47fd2ccf64f28f96a0b67b7418a2496a29cf989c4888a6ad5a22f"} Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.569566 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c7c6161cce47fd2ccf64f28f96a0b67b7418a2496a29cf989c4888a6ad5a22f" Mar 17 01:28:26 crc kubenswrapper[4735]: I0317 01:28:26.569522 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j75nx" Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.399296 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9ngz2-config-t4bzp"] Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.408262 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9ngz2-config-t4bzp"] Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.581828 4735 generic.go:334] "Generic (PLEG): container finished" podID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerID="7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155" exitCode=0 Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.581958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e","Type":"ContainerDied","Data":"7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155"} Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.583763 4735 generic.go:334] "Generic (PLEG): container finished" podID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerID="d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423" exitCode=0 Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.583799 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb9ecd2f-1a7b-4c45-b722-c88f25b27487","Type":"ContainerDied","Data":"d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423"} Mar 17 01:28:27 crc kubenswrapper[4735]: I0317 01:28:27.585563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vsj4g" event={"ID":"70c0c683-4152-4360-a49f-22a72c25ad1c","Type":"ContainerStarted","Data":"11c8ba407d5421d1afc3bcd2b34a7955529e2b4abea59081714815d3bc27d06a"} Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.597658 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cb9ecd2f-1a7b-4c45-b722-c88f25b27487","Type":"ContainerStarted","Data":"6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6"} Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.598230 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.600239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e","Type":"ContainerStarted","Data":"b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e"} Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.600549 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.636968 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vsj4g" podStartSLOduration=3.834731051 podStartE2EDuration="21.636947879s" podCreationTimestamp="2026-03-17 01:28:07 +0000 UTC" firstStartedPulling="2026-03-17 01:28:08.418331119 +0000 UTC m=+1114.050564097" lastFinishedPulling="2026-03-17 01:28:26.220547947 +0000 UTC m=+1131.852780925" observedRunningTime="2026-03-17 01:28:27.665420629 +0000 UTC m=+1133.297653607" watchObservedRunningTime="2026-03-17 01:28:28.636947879 +0000 UTC m=+1134.269180857" Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.638870 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.781033713 podStartE2EDuration="1m15.638866797s" podCreationTimestamp="2026-03-17 01:27:13 +0000 UTC" firstStartedPulling="2026-03-17 01:27:15.480287907 +0000 UTC m=+1061.112520885" lastFinishedPulling="2026-03-17 01:27:54.338120991 +0000 UTC m=+1099.970353969" observedRunningTime="2026-03-17 01:28:28.633783271 +0000 UTC m=+1134.266016289" watchObservedRunningTime="2026-03-17 
01:28:28.638866797 +0000 UTC m=+1134.271099775" Mar 17 01:28:28 crc kubenswrapper[4735]: I0317 01:28:28.695306 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.280376112 podStartE2EDuration="1m16.695289887s" podCreationTimestamp="2026-03-17 01:27:12 +0000 UTC" firstStartedPulling="2026-03-17 01:27:14.930776861 +0000 UTC m=+1060.563009839" lastFinishedPulling="2026-03-17 01:27:54.345690636 +0000 UTC m=+1099.977923614" observedRunningTime="2026-03-17 01:28:28.693736328 +0000 UTC m=+1134.325969306" watchObservedRunningTime="2026-03-17 01:28:28.695289887 +0000 UTC m=+1134.327522865" Mar 17 01:28:29 crc kubenswrapper[4735]: I0317 01:28:29.086222 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c86aca-f8f9-41b4-9247-c112a39231f0" path="/var/lib/kubelet/pods/83c86aca-f8f9-41b4-9247-c112a39231f0/volumes" Mar 17 01:28:32 crc kubenswrapper[4735]: I0317 01:28:32.519933 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:32 crc kubenswrapper[4735]: I0317 01:28:32.529024 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/630046af-37dd-4e3d-9109-c130caec8508-etc-swift\") pod \"swift-storage-0\" (UID: \"630046af-37dd-4e3d-9109-c130caec8508\") " pod="openstack/swift-storage-0" Mar 17 01:28:32 crc kubenswrapper[4735]: I0317 01:28:32.686153 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 17 01:28:33 crc kubenswrapper[4735]: I0317 01:28:33.249230 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 17 01:28:33 crc kubenswrapper[4735]: I0317 01:28:33.632634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"4df09d4334dab19b91b64f13ec3c2b9b5084b35775a7eac2c51a37f82b7dd5ed"} Mar 17 01:28:34 crc kubenswrapper[4735]: I0317 01:28:34.642454 4735 generic.go:334] "Generic (PLEG): container finished" podID="70c0c683-4152-4360-a49f-22a72c25ad1c" containerID="11c8ba407d5421d1afc3bcd2b34a7955529e2b4abea59081714815d3bc27d06a" exitCode=0 Mar 17 01:28:34 crc kubenswrapper[4735]: I0317 01:28:34.643121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vsj4g" event={"ID":"70c0c683-4152-4360-a49f-22a72c25ad1c","Type":"ContainerDied","Data":"11c8ba407d5421d1afc3bcd2b34a7955529e2b4abea59081714815d3bc27d06a"} Mar 17 01:28:34 crc kubenswrapper[4735]: I0317 01:28:34.645789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"401dbf68296a626a1e1e64a9966096844fd676c96b8e06bfdb51cc2f98a64faa"} Mar 17 01:28:34 crc kubenswrapper[4735]: I0317 01:28:34.645827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"26cc8ffb405b2c25b8710e21ba39c4e648c4f6c4b5c1bcb17f7070859d3f4e6b"} Mar 17 01:28:35 crc kubenswrapper[4735]: I0317 01:28:35.660836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"a75eb31dbc2dac24debcb5bf9915ba611b8a596ba1ddbcf446c95a538008cdbe"} Mar 17 01:28:35 crc 
kubenswrapper[4735]: I0317 01:28:35.661137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"eb7f4e375315f150bd8483fdd90f0e648d86bbf755afeb55eab1e7961d813021"} Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.008711 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.192655 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-combined-ca-bundle\") pod \"70c0c683-4152-4360-a49f-22a72c25ad1c\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.192976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmscb\" (UniqueName: \"kubernetes.io/projected/70c0c683-4152-4360-a49f-22a72c25ad1c-kube-api-access-jmscb\") pod \"70c0c683-4152-4360-a49f-22a72c25ad1c\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.193038 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-db-sync-config-data\") pod \"70c0c683-4152-4360-a49f-22a72c25ad1c\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.193087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-config-data\") pod \"70c0c683-4152-4360-a49f-22a72c25ad1c\" (UID: \"70c0c683-4152-4360-a49f-22a72c25ad1c\") " Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.200953 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "70c0c683-4152-4360-a49f-22a72c25ad1c" (UID: "70c0c683-4152-4360-a49f-22a72c25ad1c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.204108 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c0c683-4152-4360-a49f-22a72c25ad1c-kube-api-access-jmscb" (OuterVolumeSpecName: "kube-api-access-jmscb") pod "70c0c683-4152-4360-a49f-22a72c25ad1c" (UID: "70c0c683-4152-4360-a49f-22a72c25ad1c"). InnerVolumeSpecName "kube-api-access-jmscb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.222006 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c0c683-4152-4360-a49f-22a72c25ad1c" (UID: "70c0c683-4152-4360-a49f-22a72c25ad1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.241502 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-config-data" (OuterVolumeSpecName: "config-data") pod "70c0c683-4152-4360-a49f-22a72c25ad1c" (UID: "70c0c683-4152-4360-a49f-22a72c25ad1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.296071 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.296103 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmscb\" (UniqueName: \"kubernetes.io/projected/70c0c683-4152-4360-a49f-22a72c25ad1c-kube-api-access-jmscb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.296116 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.296124 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c0c683-4152-4360-a49f-22a72c25ad1c-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.673655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vsj4g" event={"ID":"70c0c683-4152-4360-a49f-22a72c25ad1c","Type":"ContainerDied","Data":"48661a93debc0dfe72e8ac7ff59f02b4e6ed3e7122cb4ed48103b40ebb333f17"} Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.673699 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48661a93debc0dfe72e8ac7ff59f02b4e6ed3e7122cb4ed48103b40ebb333f17" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.673762 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vsj4g" Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.681024 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"8c4afa3e12a468643195ebc637f6b97533c68638cc0ac2f5e69c0cf66cee5ba0"} Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.681066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"4726f184ea2c2fd4311af8df9ae4e2825580357fe2b10627151713b4ca88cd1e"} Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.681076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"192191a432ac0b1256e1b88e338b6e1a385759f511a7981b7547f45f04665a60"} Mar 17 01:28:36 crc kubenswrapper[4735]: I0317 01:28:36.681084 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"4ae90ad46473b9818f447de9745311bbd254ec2d00d0dcc143a31b17fda3d071"} Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.089124 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b5794765-qh4jm"] Mar 17 01:28:37 crc kubenswrapper[4735]: E0317 01:28:37.089726 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2639420-3978-4b55-a81e-f1e770e09cf2" containerName="swift-ring-rebalance" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.089749 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2639420-3978-4b55-a81e-f1e770e09cf2" containerName="swift-ring-rebalance" Mar 17 01:28:37 crc kubenswrapper[4735]: E0317 01:28:37.089766 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" containerName="mariadb-account-create-update" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.089775 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" containerName="mariadb-account-create-update" Mar 17 01:28:37 crc kubenswrapper[4735]: E0317 01:28:37.089797 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c0c683-4152-4360-a49f-22a72c25ad1c" containerName="glance-db-sync" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.089804 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c0c683-4152-4360-a49f-22a72c25ad1c" containerName="glance-db-sync" Mar 17 01:28:37 crc kubenswrapper[4735]: E0317 01:28:37.089825 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c86aca-f8f9-41b4-9247-c112a39231f0" containerName="ovn-config" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.089833 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c86aca-f8f9-41b4-9247-c112a39231f0" containerName="ovn-config" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.090047 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" containerName="mariadb-account-create-update" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.090073 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c86aca-f8f9-41b4-9247-c112a39231f0" containerName="ovn-config" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.090092 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c0c683-4152-4360-a49f-22a72c25ad1c" containerName="glance-db-sync" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.090104 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2639420-3978-4b55-a81e-f1e770e09cf2" containerName="swift-ring-rebalance" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.091807 4735 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.114474 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5794765-qh4jm"] Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.217304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-dns-svc\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.217350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.217392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.217412 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-config\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.217491 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqwtg\" (UniqueName: \"kubernetes.io/projected/7922f144-fceb-493c-89ee-d1c2f5e501e6-kube-api-access-kqwtg\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.318804 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-dns-svc\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.318842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.318892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.318912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-config\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.318964 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kqwtg\" (UniqueName: \"kubernetes.io/projected/7922f144-fceb-493c-89ee-d1c2f5e501e6-kube-api-access-kqwtg\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.320140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-dns-svc\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.320732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.321220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.321322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-config\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.336984 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwtg\" (UniqueName: 
\"kubernetes.io/projected/7922f144-fceb-493c-89ee-d1c2f5e501e6-kube-api-access-kqwtg\") pod \"dnsmasq-dns-79b5794765-qh4jm\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.409255 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:37 crc kubenswrapper[4735]: I0317 01:28:37.755205 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5794765-qh4jm"] Mar 17 01:28:37 crc kubenswrapper[4735]: W0317 01:28:37.937137 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7922f144_fceb_493c_89ee_d1c2f5e501e6.slice/crio-3d4286d4efbcfb039cb9208d1595192e60c9655b1e2250f5a680ecb2ef3251a3 WatchSource:0}: Error finding container 3d4286d4efbcfb039cb9208d1595192e60c9655b1e2250f5a680ecb2ef3251a3: Status 404 returned error can't find the container with id 3d4286d4efbcfb039cb9208d1595192e60c9655b1e2250f5a680ecb2ef3251a3 Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.732599 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"9f0351cacf09ff604a9fa0376a6cb23fa22c68c3c8ce040b62898b4609f97d8c"} Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.733000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"7807874550dfe200b53a17b29107be0218438a7a4d23580bf8165a90d2f97c70"} Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.733011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"f654a610e07abc0caae9de994597c338a2638710f53410082e2aa3a41d574a28"} Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.733020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"fdcaa4ff1f4e830c6b9e86934bd6fb7a100e6b2e06bb45a6b998ee4803bd78a7"} Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.739401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" event={"ID":"7922f144-fceb-493c-89ee-d1c2f5e501e6","Type":"ContainerDied","Data":"3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8"} Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.739479 4735 generic.go:334] "Generic (PLEG): container finished" podID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerID="3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8" exitCode=0 Mar 17 01:28:38 crc kubenswrapper[4735]: I0317 01:28:38.739509 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" event={"ID":"7922f144-fceb-493c-89ee-d1c2f5e501e6","Type":"ContainerStarted","Data":"3d4286d4efbcfb039cb9208d1595192e60c9655b1e2250f5a680ecb2ef3251a3"} Mar 17 01:28:39 crc kubenswrapper[4735]: I0317 01:28:39.752781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"d1193c5ea836aa1bbefa1baceff8cb85ddd4e0f7c9db4c68e42b8838f64f3195"} Mar 17 01:28:39 crc kubenswrapper[4735]: I0317 01:28:39.753085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"9475bd97a2f81c5b5f1250074785e9c7f86b1b3e3335d1c32929f2fe4be45960"} Mar 17 01:28:39 crc kubenswrapper[4735]: I0317 
01:28:39.753100 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"630046af-37dd-4e3d-9109-c130caec8508","Type":"ContainerStarted","Data":"33a70d4b7403e2417f38c7fd4855e174b0423c567dfc02b1ffca9957167aa92b"} Mar 17 01:28:39 crc kubenswrapper[4735]: I0317 01:28:39.755045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" event={"ID":"7922f144-fceb-493c-89ee-d1c2f5e501e6","Type":"ContainerStarted","Data":"f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b"} Mar 17 01:28:39 crc kubenswrapper[4735]: I0317 01:28:39.755355 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:39 crc kubenswrapper[4735]: I0317 01:28:39.814132 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.549243312 podStartE2EDuration="40.814110081s" podCreationTimestamp="2026-03-17 01:27:59 +0000 UTC" firstStartedPulling="2026-03-17 01:28:33.254056018 +0000 UTC m=+1138.886289006" lastFinishedPulling="2026-03-17 01:28:37.518922787 +0000 UTC m=+1143.151155775" observedRunningTime="2026-03-17 01:28:39.793590655 +0000 UTC m=+1145.425823663" watchObservedRunningTime="2026-03-17 01:28:39.814110081 +0000 UTC m=+1145.446343089" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.069593 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" podStartSLOduration=3.069576067 podStartE2EDuration="3.069576067s" podCreationTimestamp="2026-03-17 01:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:28:39.830397093 +0000 UTC m=+1145.462630081" watchObservedRunningTime="2026-03-17 01:28:40.069576067 +0000 UTC m=+1145.701809045" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.074197 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5794765-qh4jm"] Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.114183 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9dd5d975-62wvw"] Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.115391 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.117230 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.132499 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9dd5d975-62wvw"] Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.182715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmp5\" (UniqueName: \"kubernetes.io/projected/f9283346-2c2b-45c9-90f0-ef25dcc0a250-kube-api-access-5wmp5\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.182767 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.182808 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-config\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 
17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.182932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-svc\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.182952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.183127 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.284605 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-svc\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.284649 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc 
kubenswrapper[4735]: I0317 01:28:40.284683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.284727 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmp5\" (UniqueName: \"kubernetes.io/projected/f9283346-2c2b-45c9-90f0-ef25dcc0a250-kube-api-access-5wmp5\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.284785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.285161 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-config\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.285513 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-svc\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.285572 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.285763 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.286022 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.286515 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-config\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.303195 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmp5\" (UniqueName: \"kubernetes.io/projected/f9283346-2c2b-45c9-90f0-ef25dcc0a250-kube-api-access-5wmp5\") pod \"dnsmasq-dns-7f9dd5d975-62wvw\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.430593 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:40 crc kubenswrapper[4735]: I0317 01:28:40.839104 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9dd5d975-62wvw"] Mar 17 01:28:41 crc kubenswrapper[4735]: I0317 01:28:41.771671 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerID="a08806e2f22728ee39b447fcdf79546d8164c13afe9d36b9914febe0696906c7" exitCode=0 Mar 17 01:28:41 crc kubenswrapper[4735]: I0317 01:28:41.771765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" event={"ID":"f9283346-2c2b-45c9-90f0-ef25dcc0a250","Type":"ContainerDied","Data":"a08806e2f22728ee39b447fcdf79546d8164c13afe9d36b9914febe0696906c7"} Mar 17 01:28:41 crc kubenswrapper[4735]: I0317 01:28:41.772164 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" event={"ID":"f9283346-2c2b-45c9-90f0-ef25dcc0a250","Type":"ContainerStarted","Data":"20e1f776d0261ffaeff5ae7e4f5b73ccd48b481022a63a447a978ba84e942ab6"} Mar 17 01:28:41 crc kubenswrapper[4735]: I0317 01:28:41.772938 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerName="dnsmasq-dns" containerID="cri-o://f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b" gracePeriod=10 Mar 17 01:28:41 crc kubenswrapper[4735]: I0317 01:28:41.970249 4735 scope.go:117] "RemoveContainer" containerID="625dba166345dcf9adcc326bf6914bd654a08fbe8db12eb808ece852ba6163fc" Mar 17 01:28:42 crc kubenswrapper[4735]: E0317 01:28:42.015571 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7922f144_fceb_493c_89ee_d1c2f5e501e6.slice/crio-conmon-f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7922f144_fceb_493c_89ee_d1c2f5e501e6.slice/crio-f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.182940 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.335411 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-sb\") pod \"7922f144-fceb-493c-89ee-d1c2f5e501e6\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.335562 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-nb\") pod \"7922f144-fceb-493c-89ee-d1c2f5e501e6\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.335604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-config\") pod \"7922f144-fceb-493c-89ee-d1c2f5e501e6\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.335635 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-dns-svc\") pod 
\"7922f144-fceb-493c-89ee-d1c2f5e501e6\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.335679 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqwtg\" (UniqueName: \"kubernetes.io/projected/7922f144-fceb-493c-89ee-d1c2f5e501e6-kube-api-access-kqwtg\") pod \"7922f144-fceb-493c-89ee-d1c2f5e501e6\" (UID: \"7922f144-fceb-493c-89ee-d1c2f5e501e6\") " Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.354681 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7922f144-fceb-493c-89ee-d1c2f5e501e6-kube-api-access-kqwtg" (OuterVolumeSpecName: "kube-api-access-kqwtg") pod "7922f144-fceb-493c-89ee-d1c2f5e501e6" (UID: "7922f144-fceb-493c-89ee-d1c2f5e501e6"). InnerVolumeSpecName "kube-api-access-kqwtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.390631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7922f144-fceb-493c-89ee-d1c2f5e501e6" (UID: "7922f144-fceb-493c-89ee-d1c2f5e501e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.414525 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-config" (OuterVolumeSpecName: "config") pod "7922f144-fceb-493c-89ee-d1c2f5e501e6" (UID: "7922f144-fceb-493c-89ee-d1c2f5e501e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.418739 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7922f144-fceb-493c-89ee-d1c2f5e501e6" (UID: "7922f144-fceb-493c-89ee-d1c2f5e501e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.438818 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.438840 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.438848 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.438869 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqwtg\" (UniqueName: \"kubernetes.io/projected/7922f144-fceb-493c-89ee-d1c2f5e501e6-kube-api-access-kqwtg\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.441523 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7922f144-fceb-493c-89ee-d1c2f5e501e6" (UID: "7922f144-fceb-493c-89ee-d1c2f5e501e6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.541169 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7922f144-fceb-493c-89ee-d1c2f5e501e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.606646 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.606759 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.790220 4735 generic.go:334] "Generic (PLEG): container finished" podID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerID="f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b" exitCode=0 Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.790293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" event={"ID":"7922f144-fceb-493c-89ee-d1c2f5e501e6","Type":"ContainerDied","Data":"f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b"} Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.790368 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.790931 4735 scope.go:117] "RemoveContainer" containerID="f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.791079 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5794765-qh4jm" event={"ID":"7922f144-fceb-493c-89ee-d1c2f5e501e6","Type":"ContainerDied","Data":"3d4286d4efbcfb039cb9208d1595192e60c9655b1e2250f5a680ecb2ef3251a3"} Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.797354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" event={"ID":"f9283346-2c2b-45c9-90f0-ef25dcc0a250","Type":"ContainerStarted","Data":"360cd0c8d15605e0761da4bb5a065d5ffb02cf915d548deb946f2e90553c77f1"} Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.797750 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.832337 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" podStartSLOduration=2.832307352 podStartE2EDuration="2.832307352s" podCreationTimestamp="2026-03-17 01:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:28:42.831188095 +0000 UTC m=+1148.463421113" watchObservedRunningTime="2026-03-17 01:28:42.832307352 +0000 UTC m=+1148.464540330" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.838342 4735 scope.go:117] "RemoveContainer" containerID="3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.857036 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5794765-qh4jm"] Mar 17 01:28:42 crc 
kubenswrapper[4735]: I0317 01:28:42.873263 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b5794765-qh4jm"] Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.890747 4735 scope.go:117] "RemoveContainer" containerID="f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b" Mar 17 01:28:42 crc kubenswrapper[4735]: E0317 01:28:42.891327 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b\": container with ID starting with f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b not found: ID does not exist" containerID="f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.891374 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b"} err="failed to get container status \"f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b\": rpc error: code = NotFound desc = could not find container \"f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b\": container with ID starting with f68079faef70558d19e9904a3dbe90afd18757bffca3cc6a0c46daa9c83f147b not found: ID does not exist" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.891399 4735 scope.go:117] "RemoveContainer" containerID="3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8" Mar 17 01:28:42 crc kubenswrapper[4735]: E0317 01:28:42.891849 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8\": container with ID starting with 3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8 not found: ID does not exist" 
containerID="3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8" Mar 17 01:28:42 crc kubenswrapper[4735]: I0317 01:28:42.891941 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8"} err="failed to get container status \"3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8\": rpc error: code = NotFound desc = could not find container \"3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8\": container with ID starting with 3611c818b7e79d03d6157904ad3496466f4e1d10e0c72bf845291e9a8c0bdda8 not found: ID does not exist" Mar 17 01:28:43 crc kubenswrapper[4735]: I0317 01:28:43.088695 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" path="/var/lib/kubelet/pods/7922f144-fceb-493c-89ee-d1c2f5e501e6/volumes" Mar 17 01:28:44 crc kubenswrapper[4735]: I0317 01:28:44.226233 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:28:44 crc kubenswrapper[4735]: I0317 01:28:44.783128 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.063735 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-htvtb"] Mar 17 01:28:46 crc kubenswrapper[4735]: E0317 01:28:46.064292 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerName="dnsmasq-dns" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.064304 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerName="dnsmasq-dns" Mar 17 01:28:46 crc kubenswrapper[4735]: E0317 01:28:46.064315 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" 
containerName="init" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.064323 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerName="init" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.064492 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7922f144-fceb-493c-89ee-d1c2f5e501e6" containerName="dnsmasq-dns" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.064981 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.077644 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-htvtb"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.114283 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-operator-scripts\") pod \"cinder-db-create-htvtb\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.114379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgl4v\" (UniqueName: \"kubernetes.io/projected/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-kube-api-access-vgl4v\") pod \"cinder-db-create-htvtb\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.176438 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-56e5-account-create-update-fjdzb"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.177434 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.187903 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.197764 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-56e5-account-create-update-fjdzb"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.215568 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a84207b1-d950-43ac-b9b9-315e32f3abce-operator-scripts\") pod \"cinder-56e5-account-create-update-fjdzb\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.215633 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6wr\" (UniqueName: \"kubernetes.io/projected/a84207b1-d950-43ac-b9b9-315e32f3abce-kube-api-access-nw6wr\") pod \"cinder-56e5-account-create-update-fjdzb\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.215720 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-operator-scripts\") pod \"cinder-db-create-htvtb\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.216396 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgl4v\" (UniqueName: \"kubernetes.io/projected/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-kube-api-access-vgl4v\") pod \"cinder-db-create-htvtb\" (UID: 
\"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.216308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-operator-scripts\") pod \"cinder-db-create-htvtb\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.239877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgl4v\" (UniqueName: \"kubernetes.io/projected/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-kube-api-access-vgl4v\") pod \"cinder-db-create-htvtb\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.281682 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-8rx27"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.282576 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.294172 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8rx27"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.327041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a84207b1-d950-43ac-b9b9-315e32f3abce-operator-scripts\") pod \"cinder-56e5-account-create-update-fjdzb\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.327114 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6wr\" (UniqueName: \"kubernetes.io/projected/a84207b1-d950-43ac-b9b9-315e32f3abce-kube-api-access-nw6wr\") pod \"cinder-56e5-account-create-update-fjdzb\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.328060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a84207b1-d950-43ac-b9b9-315e32f3abce-operator-scripts\") pod \"cinder-56e5-account-create-update-fjdzb\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.348374 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6wr\" (UniqueName: \"kubernetes.io/projected/a84207b1-d950-43ac-b9b9-315e32f3abce-kube-api-access-nw6wr\") pod \"cinder-56e5-account-create-update-fjdzb\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.379472 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.402154 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-bf9f-account-create-update-6plhb"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.403386 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.407214 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.423303 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-bf9f-account-create-update-6plhb"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.428059 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/988c6c3e-17e6-48d3-8428-3103523980a0-operator-scripts\") pod \"heat-db-create-8rx27\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.429139 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqnsr\" (UniqueName: \"kubernetes.io/projected/988c6c3e-17e6-48d3-8428-3103523980a0-kube-api-access-wqnsr\") pod \"heat-db-create-8rx27\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.490642 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.532888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vpx\" (UniqueName: \"kubernetes.io/projected/f9050259-4c72-4b29-8025-7ad50bc08910-kube-api-access-b2vpx\") pod \"heat-bf9f-account-create-update-6plhb\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.532934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/988c6c3e-17e6-48d3-8428-3103523980a0-operator-scripts\") pod \"heat-db-create-8rx27\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.532993 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqnsr\" (UniqueName: \"kubernetes.io/projected/988c6c3e-17e6-48d3-8428-3103523980a0-kube-api-access-wqnsr\") pod \"heat-db-create-8rx27\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.533034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9050259-4c72-4b29-8025-7ad50bc08910-operator-scripts\") pod \"heat-bf9f-account-create-update-6plhb\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.533731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/988c6c3e-17e6-48d3-8428-3103523980a0-operator-scripts\") pod \"heat-db-create-8rx27\" 
(UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.580595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqnsr\" (UniqueName: \"kubernetes.io/projected/988c6c3e-17e6-48d3-8428-3103523980a0-kube-api-access-wqnsr\") pod \"heat-db-create-8rx27\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.594580 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hx4sl"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.600541 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.601320 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8rx27" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.602809 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p2vmn" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.603102 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.603331 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.603431 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.632459 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wgc9q"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.633757 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.634436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9050259-4c72-4b29-8025-7ad50bc08910-operator-scripts\") pod \"heat-bf9f-account-create-update-6plhb\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.634477 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-config-data\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.634542 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhw29\" (UniqueName: \"kubernetes.io/projected/706bfd83-931b-46f6-9c7a-aa4e915cb054-kube-api-access-qhw29\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.634568 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vpx\" (UniqueName: \"kubernetes.io/projected/f9050259-4c72-4b29-8025-7ad50bc08910-kube-api-access-b2vpx\") pod \"heat-bf9f-account-create-update-6plhb\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.634587 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-combined-ca-bundle\") pod 
\"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.635327 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9050259-4c72-4b29-8025-7ad50bc08910-operator-scripts\") pod \"heat-bf9f-account-create-update-6plhb\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.690657 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hx4sl"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.694707 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vpx\" (UniqueName: \"kubernetes.io/projected/f9050259-4c72-4b29-8025-7ad50bc08910-kube-api-access-b2vpx\") pod \"heat-bf9f-account-create-update-6plhb\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.732006 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wgc9q"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.735276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp4x\" (UniqueName: \"kubernetes.io/projected/6abe75a9-68d1-4233-a036-93f692c82544-kube-api-access-vhp4x\") pod \"barbican-db-create-wgc9q\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.735345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-config-data\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " 
pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.735393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abe75a9-68d1-4233-a036-93f692c82544-operator-scripts\") pod \"barbican-db-create-wgc9q\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.735424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhw29\" (UniqueName: \"kubernetes.io/projected/706bfd83-931b-46f6-9c7a-aa4e915cb054-kube-api-access-qhw29\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.735454 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-combined-ca-bundle\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.739637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-config-data\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.747302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-combined-ca-bundle\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 
01:28:46.762535 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9e40-account-create-update-jcwj9"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.763630 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.771103 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hbfv6"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.771155 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.772282 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.785450 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9e40-account-create-update-jcwj9"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.788687 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.797964 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hbfv6"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.798741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhw29\" (UniqueName: \"kubernetes.io/projected/706bfd83-931b-46f6-9c7a-aa4e915cb054-kube-api-access-qhw29\") pod \"keystone-db-sync-hx4sl\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.823274 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9b09-account-create-update-mtbq2"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.824235 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.827085 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hxl\" (UniqueName: \"kubernetes.io/projected/374187e8-9018-4baa-a078-1f80c9b4f0ff-kube-api-access-94hxl\") pod \"barbican-9b09-account-create-update-mtbq2\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836192 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gtkt\" (UniqueName: \"kubernetes.io/projected/97d71250-2369-4916-a9ba-1e5ecaa7ac76-kube-api-access-8gtkt\") pod \"neutron-9e40-account-create-update-jcwj9\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " 
pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836311 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abe75a9-68d1-4233-a036-93f692c82544-operator-scripts\") pod \"barbican-db-create-wgc9q\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e99284f-9b1b-408c-8d81-9562bcf5a449-operator-scripts\") pod \"neutron-db-create-hbfv6\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836490 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhp4x\" (UniqueName: \"kubernetes.io/projected/6abe75a9-68d1-4233-a036-93f692c82544-kube-api-access-vhp4x\") pod \"barbican-db-create-wgc9q\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836551 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/374187e8-9018-4baa-a078-1f80c9b4f0ff-operator-scripts\") pod \"barbican-9b09-account-create-update-mtbq2\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836632 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97d71250-2369-4916-a9ba-1e5ecaa7ac76-operator-scripts\") pod 
\"neutron-9e40-account-create-update-jcwj9\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.836693 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqmk\" (UniqueName: \"kubernetes.io/projected/7e99284f-9b1b-408c-8d81-9562bcf5a449-kube-api-access-qjqmk\") pod \"neutron-db-create-hbfv6\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.837293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abe75a9-68d1-4233-a036-93f692c82544-operator-scripts\") pod \"barbican-db-create-wgc9q\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.879355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhp4x\" (UniqueName: \"kubernetes.io/projected/6abe75a9-68d1-4233-a036-93f692c82544-kube-api-access-vhp4x\") pod \"barbican-db-create-wgc9q\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.884080 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9b09-account-create-update-mtbq2"] Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.937123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e99284f-9b1b-408c-8d81-9562bcf5a449-operator-scripts\") pod \"neutron-db-create-hbfv6\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.937322 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/374187e8-9018-4baa-a078-1f80c9b4f0ff-operator-scripts\") pod \"barbican-9b09-account-create-update-mtbq2\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.937436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97d71250-2369-4916-a9ba-1e5ecaa7ac76-operator-scripts\") pod \"neutron-9e40-account-create-update-jcwj9\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.937501 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqmk\" (UniqueName: \"kubernetes.io/projected/7e99284f-9b1b-408c-8d81-9562bcf5a449-kube-api-access-qjqmk\") pod \"neutron-db-create-hbfv6\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.937583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94hxl\" (UniqueName: \"kubernetes.io/projected/374187e8-9018-4baa-a078-1f80c9b4f0ff-kube-api-access-94hxl\") pod \"barbican-9b09-account-create-update-mtbq2\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.937656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gtkt\" (UniqueName: \"kubernetes.io/projected/97d71250-2369-4916-a9ba-1e5ecaa7ac76-kube-api-access-8gtkt\") pod \"neutron-9e40-account-create-update-jcwj9\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 
01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.938614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97d71250-2369-4916-a9ba-1e5ecaa7ac76-operator-scripts\") pod \"neutron-9e40-account-create-update-jcwj9\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.939196 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/374187e8-9018-4baa-a078-1f80c9b4f0ff-operator-scripts\") pod \"barbican-9b09-account-create-update-mtbq2\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.941217 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e99284f-9b1b-408c-8d81-9562bcf5a449-operator-scripts\") pod \"neutron-db-create-hbfv6\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.955464 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.957801 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqmk\" (UniqueName: \"kubernetes.io/projected/7e99284f-9b1b-408c-8d81-9562bcf5a449-kube-api-access-qjqmk\") pod \"neutron-db-create-hbfv6\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.958608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hxl\" (UniqueName: \"kubernetes.io/projected/374187e8-9018-4baa-a078-1f80c9b4f0ff-kube-api-access-94hxl\") pod \"barbican-9b09-account-create-update-mtbq2\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.966301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gtkt\" (UniqueName: \"kubernetes.io/projected/97d71250-2369-4916-a9ba-1e5ecaa7ac76-kube-api-access-8gtkt\") pod \"neutron-9e40-account-create-update-jcwj9\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:46 crc kubenswrapper[4735]: I0317 01:28:46.970813 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.105488 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-htvtb"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.107729 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.136311 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.182680 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.219190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-56e5-account-create-update-fjdzb"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.491626 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8rx27"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.599727 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-bf9f-account-create-update-6plhb"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.729973 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hx4sl"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.833162 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wgc9q"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.878769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bf9f-account-create-update-6plhb" event={"ID":"f9050259-4c72-4b29-8025-7ad50bc08910","Type":"ContainerStarted","Data":"ff56281ef4b0b95c01f835755da81017b6dd4acdc78a426454e566119f4fd5ea"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.881894 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hx4sl" event={"ID":"706bfd83-931b-46f6-9c7a-aa4e915cb054","Type":"ContainerStarted","Data":"7cec545524acfe86d2916263d0787b7a01756c3f89902f595ba4824efdfb518d"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.892907 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hbfv6"] Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.894311 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-56e5-account-create-update-fjdzb" event={"ID":"a84207b1-d950-43ac-b9b9-315e32f3abce","Type":"ContainerStarted","Data":"751d91159dd856b28724425e05a5ff3e0cbd5e33dea057cbb77b9cb377b92956"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.894355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-56e5-account-create-update-fjdzb" event={"ID":"a84207b1-d950-43ac-b9b9-315e32f3abce","Type":"ContainerStarted","Data":"463165564f3fc3ff05e25ce24fc288b8713b31e6d7cd91c510434c0a1f68926d"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.903088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8rx27" event={"ID":"988c6c3e-17e6-48d3-8428-3103523980a0","Type":"ContainerStarted","Data":"a8a0cc62111ac4e06a0b07e257e42297a00b27ee6e767cff272900aa14c833f6"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.910707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-htvtb" event={"ID":"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5","Type":"ContainerStarted","Data":"c0ffda15eb08aa3552865f5ffa0080fdcaaf66a3d9f174cff1495bb9f782d27a"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.910757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-htvtb" event={"ID":"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5","Type":"ContainerStarted","Data":"69d477f52151f0efd6fae325a54cf03052eea1756786e24c20f7f3db8ddb38ac"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.921498 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wgc9q" event={"ID":"6abe75a9-68d1-4233-a036-93f692c82544","Type":"ContainerStarted","Data":"160b1cb1cfa37d69d3b45cd015d7d23cdee1bf439530874540b9cabea8abda6c"} Mar 17 01:28:47 crc kubenswrapper[4735]: I0317 01:28:47.924070 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-56e5-account-create-update-fjdzb" podStartSLOduration=1.924059625 
podStartE2EDuration="1.924059625s" podCreationTimestamp="2026-03-17 01:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:28:47.91575369 +0000 UTC m=+1153.547986668" watchObservedRunningTime="2026-03-17 01:28:47.924059625 +0000 UTC m=+1153.556292603" Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.297650 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-htvtb" podStartSLOduration=2.29762809 podStartE2EDuration="2.29762809s" podCreationTimestamp="2026-03-17 01:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:28:47.939089896 +0000 UTC m=+1153.571322874" watchObservedRunningTime="2026-03-17 01:28:48.29762809 +0000 UTC m=+1153.929861068" Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.298122 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9e40-account-create-update-jcwj9"] Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.311069 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9b09-account-create-update-mtbq2"] Mar 17 01:28:48 crc kubenswrapper[4735]: W0317 01:28:48.323306 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod374187e8_9018_4baa_a078_1f80c9b4f0ff.slice/crio-c9905f70596f0c59d191060319eb9ee6d8460aced0bdb44274d6654edbe0d52e WatchSource:0}: Error finding container c9905f70596f0c59d191060319eb9ee6d8460aced0bdb44274d6654edbe0d52e: Status 404 returned error can't find the container with id c9905f70596f0c59d191060319eb9ee6d8460aced0bdb44274d6654edbe0d52e Mar 17 01:28:48 crc kubenswrapper[4735]: W0317 01:28:48.327307 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97d71250_2369_4916_a9ba_1e5ecaa7ac76.slice/crio-9bbf8d73a07cdb348aa47338c522ee4170a5c7d6d402c126f18686ea620556ae WatchSource:0}: Error finding container 9bbf8d73a07cdb348aa47338c522ee4170a5c7d6d402c126f18686ea620556ae: Status 404 returned error can't find the container with id 9bbf8d73a07cdb348aa47338c522ee4170a5c7d6d402c126f18686ea620556ae Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.929585 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9050259-4c72-4b29-8025-7ad50bc08910" containerID="f1cf770ec3fdf44bba1dac1d20b6167c2c98c268035cf093e4b74cf7bdc4d92a" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.929791 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bf9f-account-create-update-6plhb" event={"ID":"f9050259-4c72-4b29-8025-7ad50bc08910","Type":"ContainerDied","Data":"f1cf770ec3fdf44bba1dac1d20b6167c2c98c268035cf093e4b74cf7bdc4d92a"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.932315 4735 generic.go:334] "Generic (PLEG): container finished" podID="a84207b1-d950-43ac-b9b9-315e32f3abce" containerID="751d91159dd856b28724425e05a5ff3e0cbd5e33dea057cbb77b9cb377b92956" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.932413 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-56e5-account-create-update-fjdzb" event={"ID":"a84207b1-d950-43ac-b9b9-315e32f3abce","Type":"ContainerDied","Data":"751d91159dd856b28724425e05a5ff3e0cbd5e33dea057cbb77b9cb377b92956"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.934367 4735 generic.go:334] "Generic (PLEG): container finished" podID="988c6c3e-17e6-48d3-8428-3103523980a0" containerID="062f033a7acde18e561f1b7cf13d5bffd51a2a1b919eca7ca1931179edbc6021" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.934406 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8rx27" 
event={"ID":"988c6c3e-17e6-48d3-8428-3103523980a0","Type":"ContainerDied","Data":"062f033a7acde18e561f1b7cf13d5bffd51a2a1b919eca7ca1931179edbc6021"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.935598 4735 generic.go:334] "Generic (PLEG): container finished" podID="97d71250-2369-4916-a9ba-1e5ecaa7ac76" containerID="577c06385ed64e451c78f2cf0eb3b6cc37f1e737f89318e8008e3cab095ecd05" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.935634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9e40-account-create-update-jcwj9" event={"ID":"97d71250-2369-4916-a9ba-1e5ecaa7ac76","Type":"ContainerDied","Data":"577c06385ed64e451c78f2cf0eb3b6cc37f1e737f89318e8008e3cab095ecd05"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.935648 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9e40-account-create-update-jcwj9" event={"ID":"97d71250-2369-4916-a9ba-1e5ecaa7ac76","Type":"ContainerStarted","Data":"9bbf8d73a07cdb348aa47338c522ee4170a5c7d6d402c126f18686ea620556ae"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.938182 4735 generic.go:334] "Generic (PLEG): container finished" podID="a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" containerID="c0ffda15eb08aa3552865f5ffa0080fdcaaf66a3d9f174cff1495bb9f782d27a" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.938272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-htvtb" event={"ID":"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5","Type":"ContainerDied","Data":"c0ffda15eb08aa3552865f5ffa0080fdcaaf66a3d9f174cff1495bb9f782d27a"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.939452 4735 generic.go:334] "Generic (PLEG): container finished" podID="6abe75a9-68d1-4233-a036-93f692c82544" containerID="0db7921a6390bef1e3f7459cef41cdf57a32b61369cbce149dd52ac72d3b329f" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.939492 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-wgc9q" event={"ID":"6abe75a9-68d1-4233-a036-93f692c82544","Type":"ContainerDied","Data":"0db7921a6390bef1e3f7459cef41cdf57a32b61369cbce149dd52ac72d3b329f"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.945463 4735 generic.go:334] "Generic (PLEG): container finished" podID="7e99284f-9b1b-408c-8d81-9562bcf5a449" containerID="89af35daf3119b143b4d5ae39f194c8902d0d3bd76e39004d16929e78961c34e" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.945544 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hbfv6" event={"ID":"7e99284f-9b1b-408c-8d81-9562bcf5a449","Type":"ContainerDied","Data":"89af35daf3119b143b4d5ae39f194c8902d0d3bd76e39004d16929e78961c34e"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.945573 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hbfv6" event={"ID":"7e99284f-9b1b-408c-8d81-9562bcf5a449","Type":"ContainerStarted","Data":"b0ca9c1dc0644078d8f4b13b019d9b5f69cea621cf193f1743d9673f47b5a2bb"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.949916 4735 generic.go:334] "Generic (PLEG): container finished" podID="374187e8-9018-4baa-a078-1f80c9b4f0ff" containerID="f74c1098ad49626659407059a2ec2ac6ecaf5ba8ef5c9460edc720b413828675" exitCode=0 Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.949955 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9b09-account-create-update-mtbq2" event={"ID":"374187e8-9018-4baa-a078-1f80c9b4f0ff","Type":"ContainerDied","Data":"f74c1098ad49626659407059a2ec2ac6ecaf5ba8ef5c9460edc720b413828675"} Mar 17 01:28:48 crc kubenswrapper[4735]: I0317 01:28:48.950002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9b09-account-create-update-mtbq2" event={"ID":"374187e8-9018-4baa-a078-1f80c9b4f0ff","Type":"ContainerStarted","Data":"c9905f70596f0c59d191060319eb9ee6d8460aced0bdb44274d6654edbe0d52e"} Mar 17 01:28:50 crc 
kubenswrapper[4735]: I0317 01:28:50.433224 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:28:50 crc kubenswrapper[4735]: I0317 01:28:50.512273 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8c8d4885-pj7bf"] Mar 17 01:28:50 crc kubenswrapper[4735]: I0317 01:28:50.516138 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" podUID="b7541744-776e-4960-8eaf-039cc5489054" containerName="dnsmasq-dns" containerID="cri-o://eb842bb48868aa1b7162a7c93662eff6b6860d3187ab9c2594f0d1da0ab24dbb" gracePeriod=10 Mar 17 01:28:50 crc kubenswrapper[4735]: I0317 01:28:50.993264 4735 generic.go:334] "Generic (PLEG): container finished" podID="b7541744-776e-4960-8eaf-039cc5489054" containerID="eb842bb48868aa1b7162a7c93662eff6b6860d3187ab9c2594f0d1da0ab24dbb" exitCode=0 Mar 17 01:28:50 crc kubenswrapper[4735]: I0317 01:28:50.993310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" event={"ID":"b7541744-776e-4960-8eaf-039cc5489054","Type":"ContainerDied","Data":"eb842bb48868aa1b7162a7c93662eff6b6860d3187ab9c2594f0d1da0ab24dbb"} Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.263079 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.282454 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.339770 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.343435 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.351003 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.397206 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgl4v\" (UniqueName: \"kubernetes.io/projected/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-kube-api-access-vgl4v\") pod \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.397287 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a84207b1-d950-43ac-b9b9-315e32f3abce-operator-scripts\") pod \"a84207b1-d950-43ac-b9b9-315e32f3abce\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.397397 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6wr\" (UniqueName: \"kubernetes.io/projected/a84207b1-d950-43ac-b9b9-315e32f3abce-kube-api-access-nw6wr\") pod \"a84207b1-d950-43ac-b9b9-315e32f3abce\" (UID: \"a84207b1-d950-43ac-b9b9-315e32f3abce\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.397417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-operator-scripts\") pod \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\" (UID: \"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.398205 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84207b1-d950-43ac-b9b9-315e32f3abce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"a84207b1-d950-43ac-b9b9-315e32f3abce" (UID: "a84207b1-d950-43ac-b9b9-315e32f3abce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.398888 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" (UID: "a55338f9-1081-4b02-b2a5-bddfd3d1f8c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.407774 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84207b1-d950-43ac-b9b9-315e32f3abce-kube-api-access-nw6wr" (OuterVolumeSpecName: "kube-api-access-nw6wr") pod "a84207b1-d950-43ac-b9b9-315e32f3abce" (UID: "a84207b1-d950-43ac-b9b9-315e32f3abce"). InnerVolumeSpecName "kube-api-access-nw6wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.420964 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-kube-api-access-vgl4v" (OuterVolumeSpecName: "kube-api-access-vgl4v") pod "a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" (UID: "a55338f9-1081-4b02-b2a5-bddfd3d1f8c5"). InnerVolumeSpecName "kube-api-access-vgl4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.454550 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8rx27" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.464219 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.493346 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.498079 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.498933 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94hxl\" (UniqueName: \"kubernetes.io/projected/374187e8-9018-4baa-a078-1f80c9b4f0ff-kube-api-access-94hxl\") pod \"374187e8-9018-4baa-a078-1f80c9b4f0ff\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.499037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9050259-4c72-4b29-8025-7ad50bc08910-operator-scripts\") pod \"f9050259-4c72-4b29-8025-7ad50bc08910\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.499145 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/374187e8-9018-4baa-a078-1f80c9b4f0ff-operator-scripts\") pod \"374187e8-9018-4baa-a078-1f80c9b4f0ff\" (UID: \"374187e8-9018-4baa-a078-1f80c9b4f0ff\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.499287 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjqmk\" (UniqueName: \"kubernetes.io/projected/7e99284f-9b1b-408c-8d81-9562bcf5a449-kube-api-access-qjqmk\") pod \"7e99284f-9b1b-408c-8d81-9562bcf5a449\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.499477 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2vpx\" (UniqueName: \"kubernetes.io/projected/f9050259-4c72-4b29-8025-7ad50bc08910-kube-api-access-b2vpx\") pod \"f9050259-4c72-4b29-8025-7ad50bc08910\" (UID: \"f9050259-4c72-4b29-8025-7ad50bc08910\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.499593 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e99284f-9b1b-408c-8d81-9562bcf5a449-operator-scripts\") pod \"7e99284f-9b1b-408c-8d81-9562bcf5a449\" (UID: \"7e99284f-9b1b-408c-8d81-9562bcf5a449\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.501912 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9050259-4c72-4b29-8025-7ad50bc08910-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9050259-4c72-4b29-8025-7ad50bc08910" (UID: "f9050259-4c72-4b29-8025-7ad50bc08910"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.505197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374187e8-9018-4baa-a078-1f80c9b4f0ff-kube-api-access-94hxl" (OuterVolumeSpecName: "kube-api-access-94hxl") pod "374187e8-9018-4baa-a078-1f80c9b4f0ff" (UID: "374187e8-9018-4baa-a078-1f80c9b4f0ff"). InnerVolumeSpecName "kube-api-access-94hxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.506008 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/374187e8-9018-4baa-a078-1f80c9b4f0ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "374187e8-9018-4baa-a078-1f80c9b4f0ff" (UID: "374187e8-9018-4baa-a078-1f80c9b4f0ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.506065 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e99284f-9b1b-408c-8d81-9562bcf5a449-kube-api-access-qjqmk" (OuterVolumeSpecName: "kube-api-access-qjqmk") pod "7e99284f-9b1b-408c-8d81-9562bcf5a449" (UID: "7e99284f-9b1b-408c-8d81-9562bcf5a449"). InnerVolumeSpecName "kube-api-access-qjqmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.507290 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9050259-4c72-4b29-8025-7ad50bc08910-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508398 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94hxl\" (UniqueName: \"kubernetes.io/projected/374187e8-9018-4baa-a078-1f80c9b4f0ff-kube-api-access-94hxl\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508484 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/374187e8-9018-4baa-a078-1f80c9b4f0ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508563 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgl4v\" (UniqueName: \"kubernetes.io/projected/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-kube-api-access-vgl4v\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508630 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjqmk\" (UniqueName: \"kubernetes.io/projected/7e99284f-9b1b-408c-8d81-9562bcf5a449-kube-api-access-qjqmk\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508704 4735 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a84207b1-d950-43ac-b9b9-315e32f3abce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508774 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6wr\" (UniqueName: \"kubernetes.io/projected/a84207b1-d950-43ac-b9b9-315e32f3abce-kube-api-access-nw6wr\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.508840 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.507764 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e99284f-9b1b-408c-8d81-9562bcf5a449-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e99284f-9b1b-408c-8d81-9562bcf5a449" (UID: "7e99284f-9b1b-408c-8d81-9562bcf5a449"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.510600 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9050259-4c72-4b29-8025-7ad50bc08910-kube-api-access-b2vpx" (OuterVolumeSpecName: "kube-api-access-b2vpx") pod "f9050259-4c72-4b29-8025-7ad50bc08910" (UID: "f9050259-4c72-4b29-8025-7ad50bc08910"). InnerVolumeSpecName "kube-api-access-b2vpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/988c6c3e-17e6-48d3-8428-3103523980a0-operator-scripts\") pod \"988c6c3e-17e6-48d3-8428-3103523980a0\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-sb\") pod \"b7541744-776e-4960-8eaf-039cc5489054\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-config\") pod \"b7541744-776e-4960-8eaf-039cc5489054\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611692 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abe75a9-68d1-4233-a036-93f692c82544-operator-scripts\") pod \"6abe75a9-68d1-4233-a036-93f692c82544\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97d71250-2369-4916-a9ba-1e5ecaa7ac76-operator-scripts\") pod \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611734 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-dns-svc\") pod \"b7541744-776e-4960-8eaf-039cc5489054\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611772 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhp4x\" (UniqueName: \"kubernetes.io/projected/6abe75a9-68d1-4233-a036-93f692c82544-kube-api-access-vhp4x\") pod \"6abe75a9-68d1-4233-a036-93f692c82544\" (UID: \"6abe75a9-68d1-4233-a036-93f692c82544\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611792 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gtkt\" (UniqueName: \"kubernetes.io/projected/97d71250-2369-4916-a9ba-1e5ecaa7ac76-kube-api-access-8gtkt\") pod \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\" (UID: \"97d71250-2369-4916-a9ba-1e5ecaa7ac76\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqnsr\" (UniqueName: \"kubernetes.io/projected/988c6c3e-17e6-48d3-8428-3103523980a0-kube-api-access-wqnsr\") pod \"988c6c3e-17e6-48d3-8428-3103523980a0\" (UID: \"988c6c3e-17e6-48d3-8428-3103523980a0\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47blw\" (UniqueName: \"kubernetes.io/projected/b7541744-776e-4960-8eaf-039cc5489054-kube-api-access-47blw\") pod \"b7541744-776e-4960-8eaf-039cc5489054\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.611927 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-nb\") pod \"b7541744-776e-4960-8eaf-039cc5489054\" (UID: \"b7541744-776e-4960-8eaf-039cc5489054\") " Mar 
17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.612083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988c6c3e-17e6-48d3-8428-3103523980a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "988c6c3e-17e6-48d3-8428-3103523980a0" (UID: "988c6c3e-17e6-48d3-8428-3103523980a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.612710 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/988c6c3e-17e6-48d3-8428-3103523980a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.612736 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2vpx\" (UniqueName: \"kubernetes.io/projected/f9050259-4c72-4b29-8025-7ad50bc08910-kube-api-access-b2vpx\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.612752 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e99284f-9b1b-408c-8d81-9562bcf5a449-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.615755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d71250-2369-4916-a9ba-1e5ecaa7ac76-kube-api-access-8gtkt" (OuterVolumeSpecName: "kube-api-access-8gtkt") pod "97d71250-2369-4916-a9ba-1e5ecaa7ac76" (UID: "97d71250-2369-4916-a9ba-1e5ecaa7ac76"). InnerVolumeSpecName "kube-api-access-8gtkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.616575 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abe75a9-68d1-4233-a036-93f692c82544-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6abe75a9-68d1-4233-a036-93f692c82544" (UID: "6abe75a9-68d1-4233-a036-93f692c82544"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.616656 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7541744-776e-4960-8eaf-039cc5489054-kube-api-access-47blw" (OuterVolumeSpecName: "kube-api-access-47blw") pod "b7541744-776e-4960-8eaf-039cc5489054" (UID: "b7541744-776e-4960-8eaf-039cc5489054"). InnerVolumeSpecName "kube-api-access-47blw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.617034 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d71250-2369-4916-a9ba-1e5ecaa7ac76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97d71250-2369-4916-a9ba-1e5ecaa7ac76" (UID: "97d71250-2369-4916-a9ba-1e5ecaa7ac76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.617171 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988c6c3e-17e6-48d3-8428-3103523980a0-kube-api-access-wqnsr" (OuterVolumeSpecName: "kube-api-access-wqnsr") pod "988c6c3e-17e6-48d3-8428-3103523980a0" (UID: "988c6c3e-17e6-48d3-8428-3103523980a0"). InnerVolumeSpecName "kube-api-access-wqnsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.618273 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abe75a9-68d1-4233-a036-93f692c82544-kube-api-access-vhp4x" (OuterVolumeSpecName: "kube-api-access-vhp4x") pod "6abe75a9-68d1-4233-a036-93f692c82544" (UID: "6abe75a9-68d1-4233-a036-93f692c82544"). InnerVolumeSpecName "kube-api-access-vhp4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.646867 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7541744-776e-4960-8eaf-039cc5489054" (UID: "b7541744-776e-4960-8eaf-039cc5489054"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.647501 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7541744-776e-4960-8eaf-039cc5489054" (UID: "b7541744-776e-4960-8eaf-039cc5489054"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.647519 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7541744-776e-4960-8eaf-039cc5489054" (UID: "b7541744-776e-4960-8eaf-039cc5489054"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.664051 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-config" (OuterVolumeSpecName: "config") pod "b7541744-776e-4960-8eaf-039cc5489054" (UID: "b7541744-776e-4960-8eaf-039cc5489054"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714190 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqnsr\" (UniqueName: \"kubernetes.io/projected/988c6c3e-17e6-48d3-8428-3103523980a0-kube-api-access-wqnsr\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714227 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47blw\" (UniqueName: \"kubernetes.io/projected/b7541744-776e-4960-8eaf-039cc5489054-kube-api-access-47blw\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714237 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714246 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714254 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714263 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6abe75a9-68d1-4233-a036-93f692c82544-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714270 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97d71250-2369-4916-a9ba-1e5ecaa7ac76-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714279 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7541744-776e-4960-8eaf-039cc5489054-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714288 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gtkt\" (UniqueName: \"kubernetes.io/projected/97d71250-2369-4916-a9ba-1e5ecaa7ac76-kube-api-access-8gtkt\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:53 crc kubenswrapper[4735]: I0317 01:28:53.714296 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhp4x\" (UniqueName: \"kubernetes.io/projected/6abe75a9-68d1-4233-a036-93f692c82544-kube-api-access-vhp4x\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.023324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8rx27" event={"ID":"988c6c3e-17e6-48d3-8428-3103523980a0","Type":"ContainerDied","Data":"a8a0cc62111ac4e06a0b07e257e42297a00b27ee6e767cff272900aa14c833f6"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.023659 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a0cc62111ac4e06a0b07e257e42297a00b27ee6e767cff272900aa14c833f6" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.023341 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-8rx27" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.025771 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-htvtb" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.026475 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-htvtb" event={"ID":"a55338f9-1081-4b02-b2a5-bddfd3d1f8c5","Type":"ContainerDied","Data":"69d477f52151f0efd6fae325a54cf03052eea1756786e24c20f7f3db8ddb38ac"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.026526 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69d477f52151f0efd6fae325a54cf03052eea1756786e24c20f7f3db8ddb38ac" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.029041 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wgc9q" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.029053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wgc9q" event={"ID":"6abe75a9-68d1-4233-a036-93f692c82544","Type":"ContainerDied","Data":"160b1cb1cfa37d69d3b45cd015d7d23cdee1bf439530874540b9cabea8abda6c"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.029379 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160b1cb1cfa37d69d3b45cd015d7d23cdee1bf439530874540b9cabea8abda6c" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.031452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hbfv6" event={"ID":"7e99284f-9b1b-408c-8d81-9562bcf5a449","Type":"ContainerDied","Data":"b0ca9c1dc0644078d8f4b13b019d9b5f69cea621cf193f1743d9673f47b5a2bb"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.031473 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hbfv6" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.031484 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0ca9c1dc0644078d8f4b13b019d9b5f69cea621cf193f1743d9673f47b5a2bb" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.033470 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9b09-account-create-update-mtbq2" event={"ID":"374187e8-9018-4baa-a078-1f80c9b4f0ff","Type":"ContainerDied","Data":"c9905f70596f0c59d191060319eb9ee6d8460aced0bdb44274d6654edbe0d52e"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.033497 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9905f70596f0c59d191060319eb9ee6d8460aced0bdb44274d6654edbe0d52e" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.033550 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b09-account-create-update-mtbq2" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.036838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bf9f-account-create-update-6plhb" event={"ID":"f9050259-4c72-4b29-8025-7ad50bc08910","Type":"ContainerDied","Data":"ff56281ef4b0b95c01f835755da81017b6dd4acdc78a426454e566119f4fd5ea"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.036891 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff56281ef4b0b95c01f835755da81017b6dd4acdc78a426454e566119f4fd5ea" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.036955 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-bf9f-account-create-update-6plhb" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.041576 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hx4sl" event={"ID":"706bfd83-931b-46f6-9c7a-aa4e915cb054","Type":"ContainerStarted","Data":"b36ed64453ea7406e638f1eb444575a2fa6027ef986a26c950e8b304f397f2f6"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.046091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-56e5-account-create-update-fjdzb" event={"ID":"a84207b1-d950-43ac-b9b9-315e32f3abce","Type":"ContainerDied","Data":"463165564f3fc3ff05e25ce24fc288b8713b31e6d7cd91c510434c0a1f68926d"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.046138 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463165564f3fc3ff05e25ce24fc288b8713b31e6d7cd91c510434c0a1f68926d" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.046342 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-56e5-account-create-update-fjdzb" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.051849 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9e40-account-create-update-jcwj9" event={"ID":"97d71250-2369-4916-a9ba-1e5ecaa7ac76","Type":"ContainerDied","Data":"9bbf8d73a07cdb348aa47338c522ee4170a5c7d6d402c126f18686ea620556ae"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.051918 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbf8d73a07cdb348aa47338c522ee4170a5c7d6d402c126f18686ea620556ae" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.051996 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9e40-account-create-update-jcwj9" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.054747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" event={"ID":"b7541744-776e-4960-8eaf-039cc5489054","Type":"ContainerDied","Data":"4f1ff03b3ee65d1d7cf197d27a3a10922994b066801b21bc6be5a7d234f0d89c"} Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.054829 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8c8d4885-pj7bf" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.054851 4735 scope.go:117] "RemoveContainer" containerID="eb842bb48868aa1b7162a7c93662eff6b6860d3187ab9c2594f0d1da0ab24dbb" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.068583 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hx4sl" podStartSLOduration=2.677081231 podStartE2EDuration="8.0685538s" podCreationTimestamp="2026-03-17 01:28:46 +0000 UTC" firstStartedPulling="2026-03-17 01:28:47.783245506 +0000 UTC m=+1153.415478484" lastFinishedPulling="2026-03-17 01:28:53.174718075 +0000 UTC m=+1158.806951053" observedRunningTime="2026-03-17 01:28:54.067822652 +0000 UTC m=+1159.700055640" watchObservedRunningTime="2026-03-17 01:28:54.0685538 +0000 UTC m=+1159.700786808" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.102286 4735 scope.go:117] "RemoveContainer" containerID="9fc0178fa3830d69abf382ab8624df041a2d63aad099493eb0e79ae3543559a1" Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.264432 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8c8d4885-pj7bf"] Mar 17 01:28:54 crc kubenswrapper[4735]: I0317 01:28:54.272167 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8c8d4885-pj7bf"] Mar 17 01:28:55 crc kubenswrapper[4735]: I0317 01:28:55.090942 4735 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b7541744-776e-4960-8eaf-039cc5489054" path="/var/lib/kubelet/pods/b7541744-776e-4960-8eaf-039cc5489054/volumes" Mar 17 01:28:57 crc kubenswrapper[4735]: I0317 01:28:57.089539 4735 generic.go:334] "Generic (PLEG): container finished" podID="706bfd83-931b-46f6-9c7a-aa4e915cb054" containerID="b36ed64453ea7406e638f1eb444575a2fa6027ef986a26c950e8b304f397f2f6" exitCode=0 Mar 17 01:28:57 crc kubenswrapper[4735]: I0317 01:28:57.090253 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hx4sl" event={"ID":"706bfd83-931b-46f6-9c7a-aa4e915cb054","Type":"ContainerDied","Data":"b36ed64453ea7406e638f1eb444575a2fa6027ef986a26c950e8b304f397f2f6"} Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.636150 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.744922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-combined-ca-bundle\") pod \"706bfd83-931b-46f6-9c7a-aa4e915cb054\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.745131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-config-data\") pod \"706bfd83-931b-46f6-9c7a-aa4e915cb054\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.745159 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhw29\" (UniqueName: \"kubernetes.io/projected/706bfd83-931b-46f6-9c7a-aa4e915cb054-kube-api-access-qhw29\") pod \"706bfd83-931b-46f6-9c7a-aa4e915cb054\" (UID: \"706bfd83-931b-46f6-9c7a-aa4e915cb054\") " Mar 17 01:28:58 crc kubenswrapper[4735]: 
I0317 01:28:58.750178 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706bfd83-931b-46f6-9c7a-aa4e915cb054-kube-api-access-qhw29" (OuterVolumeSpecName: "kube-api-access-qhw29") pod "706bfd83-931b-46f6-9c7a-aa4e915cb054" (UID: "706bfd83-931b-46f6-9c7a-aa4e915cb054"). InnerVolumeSpecName "kube-api-access-qhw29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.780112 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "706bfd83-931b-46f6-9c7a-aa4e915cb054" (UID: "706bfd83-931b-46f6-9c7a-aa4e915cb054"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.785887 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-config-data" (OuterVolumeSpecName: "config-data") pod "706bfd83-931b-46f6-9c7a-aa4e915cb054" (UID: "706bfd83-931b-46f6-9c7a-aa4e915cb054"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.846438 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.846461 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706bfd83-931b-46f6-9c7a-aa4e915cb054-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:58 crc kubenswrapper[4735]: I0317 01:28:58.846470 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhw29\" (UniqueName: \"kubernetes.io/projected/706bfd83-931b-46f6-9c7a-aa4e915cb054-kube-api-access-qhw29\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.110925 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hx4sl" event={"ID":"706bfd83-931b-46f6-9c7a-aa4e915cb054","Type":"ContainerDied","Data":"7cec545524acfe86d2916263d0787b7a01756c3f89902f595ba4824efdfb518d"} Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.110966 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hx4sl" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.110974 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cec545524acfe86d2916263d0787b7a01756c3f89902f595ba4824efdfb518d" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.359644 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b968d4c5-qlg46"] Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360030 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d71250-2369-4916-a9ba-1e5ecaa7ac76" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360047 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d71250-2369-4916-a9ba-1e5ecaa7ac76" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360062 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706bfd83-931b-46f6-9c7a-aa4e915cb054" containerName="keystone-db-sync" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360069 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="706bfd83-931b-46f6-9c7a-aa4e915cb054" containerName="keystone-db-sync" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360081 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84207b1-d950-43ac-b9b9-315e32f3abce" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360088 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84207b1-d950-43ac-b9b9-315e32f3abce" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360097 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7541744-776e-4960-8eaf-039cc5489054" containerName="init" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360103 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b7541744-776e-4960-8eaf-039cc5489054" containerName="init" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360115 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e99284f-9b1b-408c-8d81-9562bcf5a449" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360121 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e99284f-9b1b-408c-8d81-9562bcf5a449" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360131 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360137 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360145 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988c6c3e-17e6-48d3-8428-3103523980a0" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360151 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="988c6c3e-17e6-48d3-8428-3103523980a0" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360157 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7541744-776e-4960-8eaf-039cc5489054" containerName="dnsmasq-dns" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360164 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7541744-776e-4960-8eaf-039cc5489054" containerName="dnsmasq-dns" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360177 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374187e8-9018-4baa-a078-1f80c9b4f0ff" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360184 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="374187e8-9018-4baa-a078-1f80c9b4f0ff" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360199 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abe75a9-68d1-4233-a036-93f692c82544" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360206 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abe75a9-68d1-4233-a036-93f692c82544" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: E0317 01:28:59.360215 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9050259-4c72-4b29-8025-7ad50bc08910" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360220 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9050259-4c72-4b29-8025-7ad50bc08910" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360364 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="374187e8-9018-4baa-a078-1f80c9b4f0ff" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360378 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d71250-2369-4916-a9ba-1e5ecaa7ac76" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360390 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6abe75a9-68d1-4233-a036-93f692c82544" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360396 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7541744-776e-4960-8eaf-039cc5489054" containerName="dnsmasq-dns" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360406 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="706bfd83-931b-46f6-9c7a-aa4e915cb054" 
containerName="keystone-db-sync" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360416 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84207b1-d950-43ac-b9b9-315e32f3abce" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360423 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="988c6c3e-17e6-48d3-8428-3103523980a0" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360433 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9050259-4c72-4b29-8025-7ad50bc08910" containerName="mariadb-account-create-update" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360444 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.360459 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e99284f-9b1b-408c-8d81-9562bcf5a449" containerName="mariadb-database-create" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.361253 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.405198 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b968d4c5-qlg46"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.430756 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7wlrb"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.431676 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.437368 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.437454 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p2vmn" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.437625 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.437641 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.437760 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.454806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-scripts\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.454851 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpv8g\" (UniqueName: \"kubernetes.io/projected/ce071301-a0a5-44bf-b166-809060e71f64-kube-api-access-qpv8g\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.454903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-fernet-keys\") pod \"keystone-bootstrap-7wlrb\" (UID: 
\"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.454922 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-combined-ca-bundle\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.454982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-swift-storage-0\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455007 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-config\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6llx7\" (UniqueName: \"kubernetes.io/projected/997293a3-f292-4059-bb3b-da392c68ea99-kube-api-access-6llx7\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-config-data\") pod 
\"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-nb\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455263 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-svc\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-credential-keys\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.455348 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-sb\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.472148 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7wlrb"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556350 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-scripts\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpv8g\" (UniqueName: \"kubernetes.io/projected/ce071301-a0a5-44bf-b166-809060e71f64-kube-api-access-qpv8g\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556428 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-fernet-keys\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-combined-ca-bundle\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556477 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-swift-storage-0\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-config\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556523 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6llx7\" (UniqueName: \"kubernetes.io/projected/997293a3-f292-4059-bb3b-da392c68ea99-kube-api-access-6llx7\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-config-data\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-nb\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556613 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-svc\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556641 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-credential-keys\") pod 
\"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.556665 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-sb\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.557436 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-swift-storage-0\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.557512 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-sb\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.557602 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-svc\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.557912 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-config\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " 
pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.558188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-nb\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.561828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-combined-ca-bundle\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.562258 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-scripts\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.567012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-config-data\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.567504 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-fernet-keys\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.595846 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-credential-keys\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.599438 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6llx7\" (UniqueName: \"kubernetes.io/projected/997293a3-f292-4059-bb3b-da392c68ea99-kube-api-access-6llx7\") pod \"keystone-bootstrap-7wlrb\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.602586 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpv8g\" (UniqueName: \"kubernetes.io/projected/ce071301-a0a5-44bf-b166-809060e71f64-kube-api-access-qpv8g\") pod \"dnsmasq-dns-57b968d4c5-qlg46\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.656551 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-97m76"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.657586 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.666337 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vsg75" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.668733 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.690381 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.701883 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55b969888f-hbb2b"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.703149 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.708171 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.709276 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.709578 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-s7t76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.709754 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.714289 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-97m76"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.753564 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55b969888f-hbb2b"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.759237 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.759909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn65f\" (UniqueName: \"kubernetes.io/projected/97ca4838-156f-4c10-83f4-cae52f5145b8-kube-api-access-rn65f\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.759957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96m7q\" (UniqueName: \"kubernetes.io/projected/f4f6de0a-1a49-4470-80f3-5c807d5899a4-kube-api-access-96m7q\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.759983 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-config-data\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.760006 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ca4838-156f-4c10-83f4-cae52f5145b8-horizon-secret-key\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.760031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ca4838-156f-4c10-83f4-cae52f5145b8-logs\") pod \"horizon-55b969888f-hbb2b\" (UID: 
\"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.760057 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-config-data\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.760086 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-scripts\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.760100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-combined-ca-bundle\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.791401 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-6v4rh"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.792340 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.800248 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.800387 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zdzns" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.811060 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.831923 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-6v4rh"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn65f\" (UniqueName: \"kubernetes.io/projected/97ca4838-156f-4c10-83f4-cae52f5145b8-kube-api-access-rn65f\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860760 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96m7q\" (UniqueName: \"kubernetes.io/projected/f4f6de0a-1a49-4470-80f3-5c807d5899a4-kube-api-access-96m7q\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-config-data\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860808 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-config\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860825 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ca4838-156f-4c10-83f4-cae52f5145b8-horizon-secret-key\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ca4838-156f-4c10-83f4-cae52f5145b8-logs\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860918 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-config-data\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-scripts\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860965 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-combined-ca-bundle\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.860992 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-combined-ca-bundle\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.861020 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj9hv\" (UniqueName: \"kubernetes.io/projected/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-kube-api-access-cj9hv\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.862443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-config-data\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.862840 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-scripts\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.865687 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ca4838-156f-4c10-83f4-cae52f5145b8-horizon-secret-key\") pod 
\"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.866194 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ca4838-156f-4c10-83f4-cae52f5145b8-logs\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.866565 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-config-data\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.869244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-combined-ca-bundle\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.921546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn65f\" (UniqueName: \"kubernetes.io/projected/97ca4838-156f-4c10-83f4-cae52f5145b8-kube-api-access-rn65f\") pod \"horizon-55b969888f-hbb2b\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.928101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96m7q\" (UniqueName: \"kubernetes.io/projected/f4f6de0a-1a49-4470-80f3-5c807d5899a4-kube-api-access-96m7q\") pod \"heat-db-sync-97m76\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 
01:28:59.934555 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bl4z7"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.935448 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.940556 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bl4z7"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.940844 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.949781 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cw9fr"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.950894 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.965378 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.965620 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x68h4" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.965943 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.966113 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nk4hc" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.967116 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-combined-ca-bundle\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 
01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.967167 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj9hv\" (UniqueName: \"kubernetes.io/projected/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-kube-api-access-cj9hv\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.967273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-config\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.973624 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.975332 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.977996 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-97m76" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.979679 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-config\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.985985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-combined-ca-bundle\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.986592 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:28:59 crc kubenswrapper[4735]: I0317 01:28:59.986786 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.026066 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cw9fr"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.075186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-config-data\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-db-sync-config-data\") pod \"barbican-db-sync-cw9fr\" (UID: 
\"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080524 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcfd83df-2abe-43e9-ab0b-88ec269eb204-etc-machine-id\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2nc\" (UniqueName: \"kubernetes.io/projected/322cb9be-148e-4dc0-8e9c-ae716ed6925f-kube-api-access-2x2nc\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-scripts\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080790 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfxm\" (UniqueName: \"kubernetes.io/projected/bcfd83df-2abe-43e9-ab0b-88ec269eb204-kube-api-access-9tfxm\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-db-sync-config-data\") pod \"cinder-db-sync-bl4z7\" (UID: 
\"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.080097 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj9hv\" (UniqueName: \"kubernetes.io/projected/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-kube-api-access-cj9hv\") pod \"neutron-db-sync-6v4rh\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.104208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-combined-ca-bundle\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.104302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-combined-ca-bundle\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.118810 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.126716 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.159314 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.205960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-log-httpd\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-combined-ca-bundle\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-combined-ca-bundle\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206073 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206094 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-config-data\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206151 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-scripts\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-config-data\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-db-sync-config-data\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8slq\" (UniqueName: \"kubernetes.io/projected/72dc2896-7cb6-4385-91a9-ae28c7f8907a-kube-api-access-h8slq\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcfd83df-2abe-43e9-ab0b-88ec269eb204-etc-machine-id\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x2nc\" (UniqueName: 
\"kubernetes.io/projected/322cb9be-148e-4dc0-8e9c-ae716ed6925f-kube-api-access-2x2nc\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-scripts\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfxm\" (UniqueName: \"kubernetes.io/projected/bcfd83df-2abe-43e9-ab0b-88ec269eb204-kube-api-access-9tfxm\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-run-httpd\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-db-sync-config-data\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.206360 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.215667 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcfd83df-2abe-43e9-ab0b-88ec269eb204-etc-machine-id\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.227716 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-db-sync-config-data\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.241579 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.242888 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.246254 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.246530 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.250117 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.250284 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-db8vv" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.256242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfxm\" (UniqueName: \"kubernetes.io/projected/bcfd83df-2abe-43e9-ab0b-88ec269eb204-kube-api-access-9tfxm\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.256259 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-combined-ca-bundle\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.264885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-config-data\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.266246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-combined-ca-bundle\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.267368 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75f648647c-q848z"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.268707 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.271296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-scripts\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.279643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2nc\" (UniqueName: \"kubernetes.io/projected/322cb9be-148e-4dc0-8e9c-ae716ed6925f-kube-api-access-2x2nc\") pod \"barbican-db-sync-cw9fr\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.279977 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-db-sync-config-data\") pod \"cinder-db-sync-bl4z7\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.284322 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.300249 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.308947 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.308986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-config-data\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.309046 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-scripts\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.309071 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8slq\" (UniqueName: \"kubernetes.io/projected/72dc2896-7cb6-4385-91a9-ae28c7f8907a-kube-api-access-h8slq\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.309099 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-run-httpd\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.309119 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.309142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-log-httpd\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.309520 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-log-httpd\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.315124 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75f648647c-q848z"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.319384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-run-httpd\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.319920 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.322115 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.323473 4735 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.323900 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-config-data\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.330062 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.330270 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.332906 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.334540 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-scripts\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.343814 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.381141 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-94fhf"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.382182 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.388725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8slq\" (UniqueName: \"kubernetes.io/projected/72dc2896-7cb6-4385-91a9-ae28c7f8907a-kube-api-access-h8slq\") pod \"ceilometer-0\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.408326 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kfjgp" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.408502 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.408602 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.411892 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.411930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-config-data\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.411960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzg49\" (UniqueName: 
\"kubernetes.io/projected/cba3993f-b584-4b37-bdf8-c7e254c24dc7-kube-api-access-zzg49\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.411986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412004 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-logs\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412038 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412056 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphx9\" (UniqueName: \"kubernetes.io/projected/717802f9-91db-401b-aabc-76fcb72901c4-kube-api-access-qphx9\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412094 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53269e6-dba4-4c47-85c0-38a04c7c760a-logs\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412109 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412123 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412158 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-scripts\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412172 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412198 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412247 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412265 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412284 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rzs\" (UniqueName: \"kubernetes.io/projected/b53269e6-dba4-4c47-85c0-38a04c7c760a-kube-api-access-56rzs\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.412307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b53269e6-dba4-4c47-85c0-38a04c7c760a-horizon-secret-key\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.445097 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-94fhf"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.485435 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b968d4c5-qlg46"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.510459 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b8b7566d9-cvgwg"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 
01:29:00.511790 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-scripts\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515891 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515916 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515953 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") 
" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.515986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-combined-ca-bundle\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516027 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rzs\" (UniqueName: \"kubernetes.io/projected/b53269e6-dba4-4c47-85c0-38a04c7c760a-kube-api-access-56rzs\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516050 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b53269e6-dba4-4c47-85c0-38a04c7c760a-horizon-secret-key\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" 
Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-config-data\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516111 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-scripts\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzg49\" (UniqueName: \"kubernetes.io/projected/cba3993f-b584-4b37-bdf8-c7e254c24dc7-kube-api-access-zzg49\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-config-data\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516169 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516186 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-logs\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516216 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qphx9\" (UniqueName: \"kubernetes.io/projected/717802f9-91db-401b-aabc-76fcb72901c4-kube-api-access-qphx9\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516249 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4010b52e-ec5e-431e-9a94-48a03dfdcce6-logs\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bw4wm\" (UniqueName: \"kubernetes.io/projected/4010b52e-ec5e-431e-9a94-48a03dfdcce6-kube-api-access-bw4wm\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53269e6-dba4-4c47-85c0-38a04c7c760a-logs\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.516349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.518463 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.519542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.520247 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53269e6-dba4-4c47-85c0-38a04c7c760a-logs\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.524833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.527143 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-logs\") pod \"glance-default-external-api-0\" 
(UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.527374 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-scripts\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.528548 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.529124 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.530228 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.531130 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc 
kubenswrapper[4735]: I0317 01:29:00.532542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.536056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-config-data\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.540024 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.540355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.544750 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.548776 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zzg49\" (UniqueName: \"kubernetes.io/projected/cba3993f-b584-4b37-bdf8-c7e254c24dc7-kube-api-access-zzg49\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.550476 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.550998 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rzs\" (UniqueName: \"kubernetes.io/projected/b53269e6-dba4-4c47-85c0-38a04c7c760a-kube-api-access-56rzs\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.551503 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b8b7566d9-cvgwg"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.557836 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.563600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.582316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b53269e6-dba4-4c47-85c0-38a04c7c760a-horizon-secret-key\") pod \"horizon-75f648647c-q848z\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.593705 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphx9\" (UniqueName: \"kubernetes.io/projected/717802f9-91db-401b-aabc-76fcb72901c4-kube-api-access-qphx9\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.640611 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4010b52e-ec5e-431e-9a94-48a03dfdcce6-logs\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4wm\" (UniqueName: \"kubernetes.io/projected/4010b52e-ec5e-431e-9a94-48a03dfdcce6-kube-api-access-bw4wm\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641755 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-svc\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-combined-ca-bundle\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " 
pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641850 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6kgn\" (UniqueName: \"kubernetes.io/projected/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-kube-api-access-b6kgn\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-scripts\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641967 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-config\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " 
pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.641990 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-config-data\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.644103 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4010b52e-ec5e-431e-9a94-48a03dfdcce6-logs\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.653195 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.668004 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.682316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4wm\" (UniqueName: \"kubernetes.io/projected/4010b52e-ec5e-431e-9a94-48a03dfdcce6-kube-api-access-bw4wm\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.682461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.683086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-config-data\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.685945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-combined-ca-bundle\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.686765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-scripts\") pod \"placement-db-sync-94fhf\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.744777 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b6kgn\" (UniqueName: \"kubernetes.io/projected/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-kube-api-access-b6kgn\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.745089 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.745128 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-config\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.745212 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-svc\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.745280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.745307 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.746098 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.746800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.747325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-config\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.747791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-svc\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.749388 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-sb\") 
pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.756980 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.770612 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6kgn\" (UniqueName: \"kubernetes.io/projected/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-kube-api-access-b6kgn\") pod \"dnsmasq-dns-6b8b7566d9-cvgwg\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.796798 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.840637 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b968d4c5-qlg46"] Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.922831 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:29:00 crc kubenswrapper[4735]: I0317 01:29:00.971112 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.101776 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-97m76"] Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.108803 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cw9fr"] Mar 17 01:29:01 crc kubenswrapper[4735]: W0317 01:29:01.183575 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod997293a3_f292_4059_bb3b_da392c68ea99.slice/crio-d812cb6fa9c268e98898f99b12de5c81b6fbd96b086c200f4cefe8505371aa14 WatchSource:0}: Error finding container d812cb6fa9c268e98898f99b12de5c81b6fbd96b086c200f4cefe8505371aa14: Status 404 returned error can't find the container with id d812cb6fa9c268e98898f99b12de5c81b6fbd96b086c200f4cefe8505371aa14 Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.199136 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7wlrb"] Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.217324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-97m76" event={"ID":"f4f6de0a-1a49-4470-80f3-5c807d5899a4","Type":"ContainerStarted","Data":"6c59e39f44d59fef9cbfc73d5b4e591e433513362c5f9d2283063b5ee960942f"} Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.228425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" event={"ID":"ce071301-a0a5-44bf-b166-809060e71f64","Type":"ContainerStarted","Data":"6144b9359fd21a1699e223e7051cbe88d203afb5106d627d70d8cdf4297f782a"} Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.245587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cw9fr" 
event={"ID":"322cb9be-148e-4dc0-8e9c-ae716ed6925f","Type":"ContainerStarted","Data":"a7972b5f934a0773f836db290b17b1e4f415cb080686081d07db6446fbe6aaa6"} Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.318673 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-6v4rh"] Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.332172 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55b969888f-hbb2b"] Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.603915 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75f648647c-q848z"] Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.618560 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-94fhf"] Mar 17 01:29:01 crc kubenswrapper[4735]: I0317 01:29:01.798241 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bl4z7"] Mar 17 01:29:01 crc kubenswrapper[4735]: W0317 01:29:01.909515 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcfd83df_2abe_43e9_ab0b_88ec269eb204.slice/crio-2cb12bc5e50c432b00eba6b28ce603c023f7104407adc45f792ab5847e2f38b5 WatchSource:0}: Error finding container 2cb12bc5e50c432b00eba6b28ce603c023f7104407adc45f792ab5847e2f38b5: Status 404 returned error can't find the container with id 2cb12bc5e50c432b00eba6b28ce603c023f7104407adc45f792ab5847e2f38b5 Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.013675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b8b7566d9-cvgwg"] Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.047617 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.132381 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:02 crc 
kubenswrapper[4735]: I0317 01:29:02.256011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75f648647c-q848z" event={"ID":"b53269e6-dba4-4c47-85c0-38a04c7c760a","Type":"ContainerStarted","Data":"4194129c91af774b2d220bd0ae8aab11313db9b67b9cd3b0bae8cbf186b9992b"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.264938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bl4z7" event={"ID":"bcfd83df-2abe-43e9-ab0b-88ec269eb204","Type":"ContainerStarted","Data":"2cb12bc5e50c432b00eba6b28ce603c023f7104407adc45f792ab5847e2f38b5"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.267307 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-94fhf" event={"ID":"4010b52e-ec5e-431e-9a94-48a03dfdcce6","Type":"ContainerStarted","Data":"1d2a5bf899a2cd868e4e65f248cc9a0d7b0d6f881aa9969d99d9bf1647d7cb58"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.268798 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wlrb" event={"ID":"997293a3-f292-4059-bb3b-da392c68ea99","Type":"ContainerStarted","Data":"ac7adbd2d4d069cd0191b2861ad467436e6dd5b80fd688426f3f0caaf4d4048e"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.268819 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wlrb" event={"ID":"997293a3-f292-4059-bb3b-da392c68ea99","Type":"ContainerStarted","Data":"d812cb6fa9c268e98898f99b12de5c81b6fbd96b086c200f4cefe8505371aa14"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.272880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b969888f-hbb2b" event={"ID":"97ca4838-156f-4c10-83f4-cae52f5145b8","Type":"ContainerStarted","Data":"4a565031903a99cc1ae2f356357595406bbc668c7cf0ff780fcefe3b4564cdab"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.296848 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6v4rh" 
event={"ID":"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e","Type":"ContainerStarted","Data":"e3eeba2f5c376c0fd87d093ab9ebbfc8f61c30f847d61eccea94c58aa6db7eed"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.296904 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6v4rh" event={"ID":"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e","Type":"ContainerStarted","Data":"6e676bb8b24762dfc340645ad4da9a53a8492957536d648c837fa90b5ff5649f"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.303668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cba3993f-b584-4b37-bdf8-c7e254c24dc7","Type":"ContainerStarted","Data":"a54b737ba6e097676685ad7b06d4132c2462f6fd22e755d98e38eccecf6894b8"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.305270 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72dc2896-7cb6-4385-91a9-ae28c7f8907a","Type":"ContainerStarted","Data":"9638b436c7a4e07c8e54ba234e95e3f3e554adf1202e4a3d969fbfe29eab847b"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.311711 4735 generic.go:334] "Generic (PLEG): container finished" podID="ce071301-a0a5-44bf-b166-809060e71f64" containerID="e70af8d3f0d9c238487e1c280e54e38119e77412aedf9957ca42e6dee4c84ad9" exitCode=0 Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.311767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" event={"ID":"ce071301-a0a5-44bf-b166-809060e71f64","Type":"ContainerDied","Data":"e70af8d3f0d9c238487e1c280e54e38119e77412aedf9957ca42e6dee4c84ad9"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.315745 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7wlrb" podStartSLOduration=3.315728746 podStartE2EDuration="3.315728746s" podCreationTimestamp="2026-03-17 01:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:02.294625516 +0000 UTC m=+1167.926858514" watchObservedRunningTime="2026-03-17 01:29:02.315728746 +0000 UTC m=+1167.947961724" Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.325829 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" event={"ID":"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a","Type":"ContainerStarted","Data":"7b6ba21185d179c465e47e5c3afbd1d5d66af8e199b1b5bdb6a32fdae5fd7189"} Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.329179 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-6v4rh" podStartSLOduration=3.3291634070000002 podStartE2EDuration="3.329163407s" podCreationTimestamp="2026-03-17 01:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:02.324830041 +0000 UTC m=+1167.957063019" watchObservedRunningTime="2026-03-17 01:29:02.329163407 +0000 UTC m=+1167.961396385" Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.857044 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.917720 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-config\") pod \"ce071301-a0a5-44bf-b166-809060e71f64\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.917786 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-sb\") pod \"ce071301-a0a5-44bf-b166-809060e71f64\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.917966 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-swift-storage-0\") pod \"ce071301-a0a5-44bf-b166-809060e71f64\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.918061 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-nb\") pod \"ce071301-a0a5-44bf-b166-809060e71f64\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.918109 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpv8g\" (UniqueName: \"kubernetes.io/projected/ce071301-a0a5-44bf-b166-809060e71f64-kube-api-access-qpv8g\") pod \"ce071301-a0a5-44bf-b166-809060e71f64\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.918133 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-svc\") pod \"ce071301-a0a5-44bf-b166-809060e71f64\" (UID: \"ce071301-a0a5-44bf-b166-809060e71f64\") " Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.934996 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce071301-a0a5-44bf-b166-809060e71f64-kube-api-access-qpv8g" (OuterVolumeSpecName: "kube-api-access-qpv8g") pod "ce071301-a0a5-44bf-b166-809060e71f64" (UID: "ce071301-a0a5-44bf-b166-809060e71f64"). InnerVolumeSpecName "kube-api-access-qpv8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.959813 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce071301-a0a5-44bf-b166-809060e71f64" (UID: "ce071301-a0a5-44bf-b166-809060e71f64"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:02 crc kubenswrapper[4735]: I0317 01:29:02.976149 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce071301-a0a5-44bf-b166-809060e71f64" (UID: "ce071301-a0a5-44bf-b166-809060e71f64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.007289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce071301-a0a5-44bf-b166-809060e71f64" (UID: "ce071301-a0a5-44bf-b166-809060e71f64"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.031685 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-config" (OuterVolumeSpecName: "config") pod "ce071301-a0a5-44bf-b166-809060e71f64" (UID: "ce071301-a0a5-44bf-b166-809060e71f64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.034619 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.035747 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpv8g\" (UniqueName: \"kubernetes.io/projected/ce071301-a0a5-44bf-b166-809060e71f64-kube-api-access-qpv8g\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.035762 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.035777 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.035785 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.035794 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 
01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.179360 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce071301-a0a5-44bf-b166-809060e71f64" (UID: "ce071301-a0a5-44bf-b166-809060e71f64"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.227746 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.269880 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce071301-a0a5-44bf-b166-809060e71f64-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.352624 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.378499 4735 generic.go:334] "Generic (PLEG): container finished" podID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerID="e815b2dc70a2e5fe384e47f3c1da4cc36f7728f0190229700bf4866bc2a79f3b" exitCode=0 Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.378552 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" event={"ID":"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a","Type":"ContainerDied","Data":"e815b2dc70a2e5fe384e47f3c1da4cc36f7728f0190229700bf4866bc2a79f3b"} Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.383902 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55b969888f-hbb2b"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.386199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" 
event={"ID":"ce071301-a0a5-44bf-b166-809060e71f64","Type":"ContainerDied","Data":"6144b9359fd21a1699e223e7051cbe88d203afb5106d627d70d8cdf4297f782a"} Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.386516 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b968d4c5-qlg46" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.400404 4735 scope.go:117] "RemoveContainer" containerID="e70af8d3f0d9c238487e1c280e54e38119e77412aedf9957ca42e6dee4c84ad9" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.408787 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.426040 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"717802f9-91db-401b-aabc-76fcb72901c4","Type":"ContainerStarted","Data":"c58c9142f3d9cbaa0545849e1feb2a2d77113f6dc32e06ce2d48f0f1bb3d1e02"} Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.457474 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5784c9cff5-vsxhg"] Mar 17 01:29:03 crc kubenswrapper[4735]: E0317 01:29:03.466210 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce071301-a0a5-44bf-b166-809060e71f64" containerName="init" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.466416 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce071301-a0a5-44bf-b166-809060e71f64" containerName="init" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.466676 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce071301-a0a5-44bf-b166-809060e71f64" containerName="init" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.468473 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.480269 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5784c9cff5-vsxhg"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.524177 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b968d4c5-qlg46"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.555642 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b968d4c5-qlg46"] Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.583139 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-config-data\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.583187 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb7f27a9-1f78-4033-8b04-4a5adf25024f-logs\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.583281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5dhv\" (UniqueName: \"kubernetes.io/projected/fb7f27a9-1f78-4033-8b04-4a5adf25024f-kube-api-access-j5dhv\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.583320 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/fb7f27a9-1f78-4033-8b04-4a5adf25024f-horizon-secret-key\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.583351 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-scripts\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.689001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb7f27a9-1f78-4033-8b04-4a5adf25024f-horizon-secret-key\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.689055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-scripts\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.689129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-config-data\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.689161 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb7f27a9-1f78-4033-8b04-4a5adf25024f-logs\") pod \"horizon-5784c9cff5-vsxhg\" (UID: 
\"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.689268 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5dhv\" (UniqueName: \"kubernetes.io/projected/fb7f27a9-1f78-4033-8b04-4a5adf25024f-kube-api-access-j5dhv\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.690380 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-scripts\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.690896 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-config-data\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.691197 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb7f27a9-1f78-4033-8b04-4a5adf25024f-logs\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.701613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb7f27a9-1f78-4033-8b04-4a5adf25024f-horizon-secret-key\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.706644 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5dhv\" (UniqueName: \"kubernetes.io/projected/fb7f27a9-1f78-4033-8b04-4a5adf25024f-kube-api-access-j5dhv\") pod \"horizon-5784c9cff5-vsxhg\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:03 crc kubenswrapper[4735]: I0317 01:29:03.986658 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:04 crc kubenswrapper[4735]: I0317 01:29:04.507323 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cba3993f-b584-4b37-bdf8-c7e254c24dc7","Type":"ContainerStarted","Data":"2a3d731465e25f4a4b23eb9e3a3e584c7c2afff012d3fb5a34661f7d178dc4b4"} Mar 17 01:29:04 crc kubenswrapper[4735]: I0317 01:29:04.551249 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5784c9cff5-vsxhg"] Mar 17 01:29:04 crc kubenswrapper[4735]: W0317 01:29:04.570718 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb7f27a9_1f78_4033_8b04_4a5adf25024f.slice/crio-54181bda33711c857fabdc5a179b2d1e75b3f74eeda7f09cdf8c2b20af232d31 WatchSource:0}: Error finding container 54181bda33711c857fabdc5a179b2d1e75b3f74eeda7f09cdf8c2b20af232d31: Status 404 returned error can't find the container with id 54181bda33711c857fabdc5a179b2d1e75b3f74eeda7f09cdf8c2b20af232d31 Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.092734 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce071301-a0a5-44bf-b166-809060e71f64" path="/var/lib/kubelet/pods/ce071301-a0a5-44bf-b166-809060e71f64/volumes" Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.569371 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" 
event={"ID":"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a","Type":"ContainerStarted","Data":"bb2c4098e1e6d7df14476cb04bda10bf346296350b7bbaf6ac2863084548a391"} Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.571314 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.603316 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" podStartSLOduration=5.603296474 podStartE2EDuration="5.603296474s" podCreationTimestamp="2026-03-17 01:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:05.593693377 +0000 UTC m=+1171.225926355" watchObservedRunningTime="2026-03-17 01:29:05.603296474 +0000 UTC m=+1171.235529452" Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.608002 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-log" containerID="cri-o://2a3d731465e25f4a4b23eb9e3a3e584c7c2afff012d3fb5a34661f7d178dc4b4" gracePeriod=30 Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.608134 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cba3993f-b584-4b37-bdf8-c7e254c24dc7","Type":"ContainerStarted","Data":"ff3601209ee21bc8748e566c487c634115cadd1a69098266f4f41830812a8cfc"} Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.608268 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-httpd" containerID="cri-o://ff3601209ee21bc8748e566c487c634115cadd1a69098266f4f41830812a8cfc" gracePeriod=30 Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.617630 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"717802f9-91db-401b-aabc-76fcb72901c4","Type":"ContainerStarted","Data":"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad"} Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.620056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5784c9cff5-vsxhg" event={"ID":"fb7f27a9-1f78-4033-8b04-4a5adf25024f","Type":"ContainerStarted","Data":"54181bda33711c857fabdc5a179b2d1e75b3f74eeda7f09cdf8c2b20af232d31"} Mar 17 01:29:05 crc kubenswrapper[4735]: I0317 01:29:05.647169 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.647151924 podStartE2EDuration="5.647151924s" podCreationTimestamp="2026-03-17 01:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:05.641878575 +0000 UTC m=+1171.274111553" watchObservedRunningTime="2026-03-17 01:29:05.647151924 +0000 UTC m=+1171.279384902" Mar 17 01:29:06 crc kubenswrapper[4735]: I0317 01:29:06.637968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"717802f9-91db-401b-aabc-76fcb72901c4","Type":"ContainerStarted","Data":"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d"} Mar 17 01:29:06 crc kubenswrapper[4735]: I0317 01:29:06.638407 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-log" containerID="cri-o://5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad" gracePeriod=30 Mar 17 01:29:06 crc kubenswrapper[4735]: I0317 01:29:06.638913 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-httpd" containerID="cri-o://43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d" gracePeriod=30 Mar 17 01:29:06 crc kubenswrapper[4735]: I0317 01:29:06.643064 4735 generic.go:334] "Generic (PLEG): container finished" podID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerID="2a3d731465e25f4a4b23eb9e3a3e584c7c2afff012d3fb5a34661f7d178dc4b4" exitCode=143 Mar 17 01:29:06 crc kubenswrapper[4735]: I0317 01:29:06.643212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cba3993f-b584-4b37-bdf8-c7e254c24dc7","Type":"ContainerDied","Data":"2a3d731465e25f4a4b23eb9e3a3e584c7c2afff012d3fb5a34661f7d178dc4b4"} Mar 17 01:29:06 crc kubenswrapper[4735]: I0317 01:29:06.668969 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.668949792 podStartE2EDuration="6.668949792s" podCreationTimestamp="2026-03-17 01:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:06.66110509 +0000 UTC m=+1172.293338068" watchObservedRunningTime="2026-03-17 01:29:06.668949792 +0000 UTC m=+1172.301182770" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.551947 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.679907 4735 generic.go:334] "Generic (PLEG): container finished" podID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerID="ff3601209ee21bc8748e566c487c634115cadd1a69098266f4f41830812a8cfc" exitCode=0 Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.680089 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cba3993f-b584-4b37-bdf8-c7e254c24dc7","Type":"ContainerDied","Data":"ff3601209ee21bc8748e566c487c634115cadd1a69098266f4f41830812a8cfc"} Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.691771 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-scripts\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.696552 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-combined-ca-bundle\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.696590 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-internal-tls-certs\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.697022 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-logs\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: 
\"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.697157 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-config-data\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.697186 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qphx9\" (UniqueName: \"kubernetes.io/projected/717802f9-91db-401b-aabc-76fcb72901c4-kube-api-access-qphx9\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.697271 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-httpd-run\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.697342 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"717802f9-91db-401b-aabc-76fcb72901c4\" (UID: \"717802f9-91db-401b-aabc-76fcb72901c4\") " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.697482 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-logs" (OuterVolumeSpecName: "logs") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.698080 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.702746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-scripts" (OuterVolumeSpecName: "scripts") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.705879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.724057 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.728277 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717802f9-91db-401b-aabc-76fcb72901c4-kube-api-access-qphx9" (OuterVolumeSpecName: "kube-api-access-qphx9") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "kube-api-access-qphx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.728386 4735 generic.go:334] "Generic (PLEG): container finished" podID="717802f9-91db-401b-aabc-76fcb72901c4" containerID="43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d" exitCode=0 Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.728414 4735 generic.go:334] "Generic (PLEG): container finished" podID="717802f9-91db-401b-aabc-76fcb72901c4" containerID="5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad" exitCode=143 Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.728555 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.729238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"717802f9-91db-401b-aabc-76fcb72901c4","Type":"ContainerDied","Data":"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d"} Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.729297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"717802f9-91db-401b-aabc-76fcb72901c4","Type":"ContainerDied","Data":"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad"} Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.729311 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"717802f9-91db-401b-aabc-76fcb72901c4","Type":"ContainerDied","Data":"c58c9142f3d9cbaa0545849e1feb2a2d77113f6dc32e06ce2d48f0f1bb3d1e02"} Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.729326 4735 scope.go:117] "RemoveContainer" containerID="43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.743598 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="997293a3-f292-4059-bb3b-da392c68ea99" containerID="ac7adbd2d4d069cd0191b2861ad467436e6dd5b80fd688426f3f0caaf4d4048e" exitCode=0 Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.744228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wlrb" event={"ID":"997293a3-f292-4059-bb3b-da392c68ea99","Type":"ContainerDied","Data":"ac7adbd2d4d069cd0191b2861ad467436e6dd5b80fd688426f3f0caaf4d4048e"} Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.767719 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.778120 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-config-data" (OuterVolumeSpecName: "config-data") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.794069 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "717802f9-91db-401b-aabc-76fcb72901c4" (UID: "717802f9-91db-401b-aabc-76fcb72901c4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800139 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800162 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800171 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800180 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qphx9\" (UniqueName: \"kubernetes.io/projected/717802f9-91db-401b-aabc-76fcb72901c4-kube-api-access-qphx9\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800189 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/717802f9-91db-401b-aabc-76fcb72901c4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800209 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.800218 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717802f9-91db-401b-aabc-76fcb72901c4-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.865396 4735 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.880554 4735 scope.go:117] "RemoveContainer" containerID="5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.898133 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.902448 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.956830 4735 scope.go:117] "RemoveContainer" containerID="43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d" Mar 17 01:29:07 crc kubenswrapper[4735]: E0317 01:29:07.957583 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d\": container with ID starting with 43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d not found: ID does not exist" containerID="43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.957625 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d"} err="failed to get container status \"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d\": rpc error: code = NotFound desc = could not find container \"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d\": container with ID starting with 43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d not found: ID does not exist" Mar 17 
01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.957651 4735 scope.go:117] "RemoveContainer" containerID="5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad" Mar 17 01:29:07 crc kubenswrapper[4735]: E0317 01:29:07.958072 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad\": container with ID starting with 5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad not found: ID does not exist" containerID="5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.958101 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad"} err="failed to get container status \"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad\": rpc error: code = NotFound desc = could not find container \"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad\": container with ID starting with 5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad not found: ID does not exist" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.958120 4735 scope.go:117] "RemoveContainer" containerID="43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.958443 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d"} err="failed to get container status \"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d\": rpc error: code = NotFound desc = could not find container \"43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d\": container with ID starting with 43521421c82e5c82ab11d6596b3020599e0d2fb96699b139c01698271978a52d not found: ID does not 
exist" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.958461 4735 scope.go:117] "RemoveContainer" containerID="5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad" Mar 17 01:29:07 crc kubenswrapper[4735]: I0317 01:29:07.958639 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad"} err="failed to get container status \"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad\": rpc error: code = NotFound desc = could not find container \"5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad\": container with ID starting with 5921472a7e235c2c9cb1127c1c84e5b083afe4bc0aae70edbb06fd86e5a4b4ad not found: ID does not exist" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004120 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-scripts\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004214 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-public-tls-certs\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004239 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzg49\" (UniqueName: 
\"kubernetes.io/projected/cba3993f-b584-4b37-bdf8-c7e254c24dc7-kube-api-access-zzg49\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-logs\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004317 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-combined-ca-bundle\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-httpd-run\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.004462 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-config-data\") pod \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\" (UID: \"cba3993f-b584-4b37-bdf8-c7e254c24dc7\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.008534 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-logs" (OuterVolumeSpecName: "logs") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.008730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba3993f-b584-4b37-bdf8-c7e254c24dc7-kube-api-access-zzg49" (OuterVolumeSpecName: "kube-api-access-zzg49") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "kube-api-access-zzg49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.009042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.022704 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.025772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-scripts" (OuterVolumeSpecName: "scripts") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.087055 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.092012 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-config-data" (OuterVolumeSpecName: "config-data") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.110693 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzg49\" (UniqueName: \"kubernetes.io/projected/cba3993f-b584-4b37-bdf8-c7e254c24dc7-kube-api-access-zzg49\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.110726 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.110737 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba3993f-b584-4b37-bdf8-c7e254c24dc7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.110750 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc 
kubenswrapper[4735]: I0317 01:29:08.110759 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.110786 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.110794 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.142919 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cba3993f-b584-4b37-bdf8-c7e254c24dc7" (UID: "cba3993f-b584-4b37-bdf8-c7e254c24dc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.149968 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.214069 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.216002 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.216029 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba3993f-b584-4b37-bdf8-c7e254c24dc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.232530 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259203 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: E0317 01:29:08.259574 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-log" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259587 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-log" Mar 17 01:29:08 crc kubenswrapper[4735]: E0317 01:29:08.259595 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-httpd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259601 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-httpd" Mar 17 01:29:08 crc kubenswrapper[4735]: E0317 01:29:08.259616 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-httpd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259622 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-httpd" Mar 17 01:29:08 crc kubenswrapper[4735]: E0317 01:29:08.259630 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-log" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259636 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-log" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259793 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-httpd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259815 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-log" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259825 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" containerName="glance-log" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.259833 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="717802f9-91db-401b-aabc-76fcb72901c4" containerName="glance-httpd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.260640 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.270246 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.270612 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75f648647c-q848z"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.270926 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.291365 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.302751 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-674b457696-6r8nd"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.304116 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.313008 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.319885 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.319944 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.319971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.320011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.320039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.320062 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.320110 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txqz7\" (UniqueName: \"kubernetes.io/projected/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-kube-api-access-txqz7\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.320134 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.339188 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-674b457696-6r8nd"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.411784 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: E0317 01:29:08.412503 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs 
kube-api-access-txqz7 logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="e7cc1d67-ef60-43c0-bd9f-374236e5ca89" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.421885 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-config-data\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422126 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txqz7\" (UniqueName: \"kubernetes.io/projected/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-kube-api-access-txqz7\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-secret-key\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422422 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7jpx\" 
(UniqueName: \"kubernetes.io/projected/32c72925-26da-41c7-8279-8bc23ef68b62-kube-api-access-s7jpx\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.422918 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423022 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-combined-ca-bundle\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423124 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-tls-certs\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423210 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423306 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-scripts\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423437 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423526 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.423609 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c72925-26da-41c7-8279-8bc23ef68b62-logs\") pod 
\"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.432043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.432631 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5784c9cff5-vsxhg"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.432820 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.435291 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.435544 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.443729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.444147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.455267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.459593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txqz7\" (UniqueName: \"kubernetes.io/projected/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-kube-api-access-txqz7\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.476185 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76cdf95cd8-vx5pd"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.477519 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.482428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.492000 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76cdf95cd8-vx5pd"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7jpx\" (UniqueName: \"kubernetes.io/projected/32c72925-26da-41c7-8279-8bc23ef68b62-kube-api-access-s7jpx\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525192 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-combined-ca-bundle\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-horizon-secret-key\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525250 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-tls-certs\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc3f6d90-40e7-4962-b788-1e9924edb48f-scripts\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-horizon-tls-certs\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3f6d90-40e7-4962-b788-1e9924edb48f-logs\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-scripts\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-combined-ca-bundle\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c72925-26da-41c7-8279-8bc23ef68b62-logs\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f6d90-40e7-4962-b788-1e9924edb48f-config-data\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-config-data\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525451 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x25kp\" (UniqueName: \"kubernetes.io/projected/fc3f6d90-40e7-4962-b788-1e9924edb48f-kube-api-access-x25kp\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.525489 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-secret-key\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.528394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c72925-26da-41c7-8279-8bc23ef68b62-logs\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.528946 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-scripts\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.532502 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-config-data\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.541059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7jpx\" (UniqueName: \"kubernetes.io/projected/32c72925-26da-41c7-8279-8bc23ef68b62-kube-api-access-s7jpx\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.543320 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-combined-ca-bundle\") pod \"horizon-674b457696-6r8nd\" (UID: 
\"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.543655 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-tls-certs\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.546569 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-secret-key\") pod \"horizon-674b457696-6r8nd\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.627125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-horizon-secret-key\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.627168 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc3f6d90-40e7-4962-b788-1e9924edb48f-scripts\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.627188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-horizon-tls-certs\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc 
kubenswrapper[4735]: I0317 01:29:08.627218 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3f6d90-40e7-4962-b788-1e9924edb48f-logs\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.627237 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-combined-ca-bundle\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.627273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f6d90-40e7-4962-b788-1e9924edb48f-config-data\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.627310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x25kp\" (UniqueName: \"kubernetes.io/projected/fc3f6d90-40e7-4962-b788-1e9924edb48f-kube-api-access-x25kp\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.630951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc3f6d90-40e7-4962-b788-1e9924edb48f-scripts\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.631629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-horizon-secret-key\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.632079 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3f6d90-40e7-4962-b788-1e9924edb48f-logs\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.632689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f6d90-40e7-4962-b788-1e9924edb48f-config-data\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.643390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-horizon-tls-certs\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.644571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x25kp\" (UniqueName: \"kubernetes.io/projected/fc3f6d90-40e7-4962-b788-1e9924edb48f-kube-api-access-x25kp\") pod \"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.645583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3f6d90-40e7-4962-b788-1e9924edb48f-combined-ca-bundle\") pod 
\"horizon-76cdf95cd8-vx5pd\" (UID: \"fc3f6d90-40e7-4962-b788-1e9924edb48f\") " pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.657708 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.755973 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.756541 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.760137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cba3993f-b584-4b37-bdf8-c7e254c24dc7","Type":"ContainerDied","Data":"a54b737ba6e097676685ad7b06d4132c2462f6fd22e755d98e38eccecf6894b8"} Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.760194 4735 scope.go:117] "RemoveContainer" containerID="ff3601209ee21bc8748e566c487c634115cadd1a69098266f4f41830812a8cfc" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.816218 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.819658 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.844324 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.870820 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.897453 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.901705 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.924287 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.924290 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.924463 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942532 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-combined-ca-bundle\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942631 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 
01:29:08.942687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-httpd-run\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942715 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-config-data\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-internal-tls-certs\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txqz7\" (UniqueName: \"kubernetes.io/projected/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-kube-api-access-txqz7\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942975 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-logs\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.942991 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-scripts\") pod \"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\" (UID: 
\"e7cc1d67-ef60-43c0-bd9f-374236e5ca89\") " Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.948302 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.950577 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-logs" (OuterVolumeSpecName: "logs") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.951061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.953239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-scripts" (OuterVolumeSpecName: "scripts") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.955005 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-kube-api-access-txqz7" (OuterVolumeSpecName: "kube-api-access-txqz7") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "kube-api-access-txqz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.956246 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.961278 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:08 crc kubenswrapper[4735]: I0317 01:29:08.970370 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-config-data" (OuterVolumeSpecName: "config-data") pod "e7cc1d67-ef60-43c0-bd9f-374236e5ca89" (UID: "e7cc1d67-ef60-43c0-bd9f-374236e5ca89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048711 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-scripts\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048788 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048813 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-config-data\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") 
" pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048875 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-logs\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048905 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6knf\" (UniqueName: \"kubernetes.io/projected/762ac106-44fe-4a09-9dcf-a55d7c4573fe-kube-api-access-v6knf\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.048943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049006 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049019 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049028 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049039 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txqz7\" (UniqueName: \"kubernetes.io/projected/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-kube-api-access-txqz7\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049047 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049054 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049062 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cc1d67-ef60-43c0-bd9f-374236e5ca89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.049080 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.067219 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.097143 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717802f9-91db-401b-aabc-76fcb72901c4" path="/var/lib/kubelet/pods/717802f9-91db-401b-aabc-76fcb72901c4/volumes" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.097956 4735 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="cba3993f-b584-4b37-bdf8-c7e254c24dc7" path="/var/lib/kubelet/pods/cba3993f-b584-4b37-bdf8-c7e254c24dc7/volumes" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6knf\" (UniqueName: \"kubernetes.io/projected/762ac106-44fe-4a09-9dcf-a55d7c4573fe-kube-api-access-v6knf\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150560 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-scripts\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150586 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: 
\"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-config-data\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-logs\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.150740 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.151159 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-logs\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.154212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.155021 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.156807 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-scripts\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.159027 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.160020 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-config-data\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.178686 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6knf\" (UniqueName: 
\"kubernetes.io/projected/762ac106-44fe-4a09-9dcf-a55d7c4573fe-kube-api-access-v6knf\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.185970 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.195674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.275646 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.790694 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.877087 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.898594 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.912430 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.913906 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.916450 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.916596 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.919083 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971257 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971297 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lfv7\" (UniqueName: \"kubernetes.io/projected/37dddec5-15b5-4e45-880b-10225e30412c-kube-api-access-5lfv7\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971346 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971362 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-logs\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:09 crc kubenswrapper[4735]: I0317 01:29:09.971382 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.073690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.073766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.073823 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.073847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.073906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lfv7\" (UniqueName: \"kubernetes.io/projected/37dddec5-15b5-4e45-880b-10225e30412c-kube-api-access-5lfv7\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.073982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-logs\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.074005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.074032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.074518 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.075709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-logs\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.076443 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"37dddec5-15b5-4e45-880b-10225e30412c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.087956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.093500 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.094738 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lfv7\" (UniqueName: \"kubernetes.io/projected/37dddec5-15b5-4e45-880b-10225e30412c-kube-api-access-5lfv7\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.095395 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.098187 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " 
pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.106056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.233342 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.798532 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.858687 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9dd5d975-62wvw"] Mar 17 01:29:10 crc kubenswrapper[4735]: I0317 01:29:10.858915 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" containerID="cri-o://360cd0c8d15605e0761da4bb5a065d5ffb02cf915d548deb946f2e90553c77f1" gracePeriod=10 Mar 17 01:29:11 crc kubenswrapper[4735]: I0317 01:29:11.093417 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7cc1d67-ef60-43c0-bd9f-374236e5ca89" path="/var/lib/kubelet/pods/e7cc1d67-ef60-43c0-bd9f-374236e5ca89/volumes" Mar 17 01:29:11 crc kubenswrapper[4735]: I0317 01:29:11.822301 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerID="360cd0c8d15605e0761da4bb5a065d5ffb02cf915d548deb946f2e90553c77f1" exitCode=0 Mar 17 01:29:11 crc kubenswrapper[4735]: I0317 01:29:11.822342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" 
event={"ID":"f9283346-2c2b-45c9-90f0-ef25dcc0a250","Type":"ContainerDied","Data":"360cd0c8d15605e0761da4bb5a065d5ffb02cf915d548deb946f2e90553c77f1"} Mar 17 01:29:12 crc kubenswrapper[4735]: I0317 01:29:12.606363 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:29:12 crc kubenswrapper[4735]: I0317 01:29:12.606634 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.753440 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.872597 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wlrb" event={"ID":"997293a3-f292-4059-bb3b-da392c68ea99","Type":"ContainerDied","Data":"d812cb6fa9c268e98898f99b12de5c81b6fbd96b086c200f4cefe8505371aa14"} Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.872631 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d812cb6fa9c268e98898f99b12de5c81b6fbd96b086c200f4cefe8505371aa14" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.873170 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7wlrb" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.875736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6llx7\" (UniqueName: \"kubernetes.io/projected/997293a3-f292-4059-bb3b-da392c68ea99-kube-api-access-6llx7\") pod \"997293a3-f292-4059-bb3b-da392c68ea99\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.875893 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-combined-ca-bundle\") pod \"997293a3-f292-4059-bb3b-da392c68ea99\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.875961 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-config-data\") pod \"997293a3-f292-4059-bb3b-da392c68ea99\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.876659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-scripts\") pod \"997293a3-f292-4059-bb3b-da392c68ea99\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.876720 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-fernet-keys\") pod \"997293a3-f292-4059-bb3b-da392c68ea99\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.876753 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-credential-keys\") pod \"997293a3-f292-4059-bb3b-da392c68ea99\" (UID: \"997293a3-f292-4059-bb3b-da392c68ea99\") " Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.887484 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-scripts" (OuterVolumeSpecName: "scripts") pod "997293a3-f292-4059-bb3b-da392c68ea99" (UID: "997293a3-f292-4059-bb3b-da392c68ea99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.888984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/997293a3-f292-4059-bb3b-da392c68ea99-kube-api-access-6llx7" (OuterVolumeSpecName: "kube-api-access-6llx7") pod "997293a3-f292-4059-bb3b-da392c68ea99" (UID: "997293a3-f292-4059-bb3b-da392c68ea99"). InnerVolumeSpecName "kube-api-access-6llx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.909822 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "997293a3-f292-4059-bb3b-da392c68ea99" (UID: "997293a3-f292-4059-bb3b-da392c68ea99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.910279 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "997293a3-f292-4059-bb3b-da392c68ea99" (UID: "997293a3-f292-4059-bb3b-da392c68ea99"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.911256 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "997293a3-f292-4059-bb3b-da392c68ea99" (UID: "997293a3-f292-4059-bb3b-da392c68ea99"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.913750 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-config-data" (OuterVolumeSpecName: "config-data") pod "997293a3-f292-4059-bb3b-da392c68ea99" (UID: "997293a3-f292-4059-bb3b-da392c68ea99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.979437 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.979473 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.979482 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.979492 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:14 crc 
kubenswrapper[4735]: I0317 01:29:14.979501 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/997293a3-f292-4059-bb3b-da392c68ea99-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:14 crc kubenswrapper[4735]: I0317 01:29:14.979509 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6llx7\" (UniqueName: \"kubernetes.io/projected/997293a3-f292-4059-bb3b-da392c68ea99-kube-api-access-6llx7\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.837620 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7wlrb"] Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.843723 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7wlrb"] Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.939576 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s6vdx"] Mar 17 01:29:15 crc kubenswrapper[4735]: E0317 01:29:15.940041 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="997293a3-f292-4059-bb3b-da392c68ea99" containerName="keystone-bootstrap" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.940053 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="997293a3-f292-4059-bb3b-da392c68ea99" containerName="keystone-bootstrap" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.940225 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="997293a3-f292-4059-bb3b-da392c68ea99" containerName="keystone-bootstrap" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.940649 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s6vdx"] Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.940714 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.956016 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.956120 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.956181 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.956011 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p2vmn" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.956356 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.995258 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-credential-keys\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.995299 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-config-data\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.995333 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mrl\" (UniqueName: \"kubernetes.io/projected/8c5b000f-5d9d-4d80-b979-725e03851ba6-kube-api-access-h2mrl\") pod \"keystone-bootstrap-s6vdx\" (UID: 
\"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.995362 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-scripts\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.995392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-fernet-keys\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:15 crc kubenswrapper[4735]: I0317 01:29:15.995458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-combined-ca-bundle\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.097488 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-combined-ca-bundle\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.097554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-credential-keys\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " 
pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.097575 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-config-data\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.097616 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mrl\" (UniqueName: \"kubernetes.io/projected/8c5b000f-5d9d-4d80-b979-725e03851ba6-kube-api-access-h2mrl\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.097651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-scripts\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.097687 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-fernet-keys\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.103224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-credential-keys\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.103391 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-config-data\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.113512 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-scripts\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.114171 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-fernet-keys\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.119085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-combined-ca-bundle\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.130444 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mrl\" (UniqueName: \"kubernetes.io/projected/8c5b000f-5d9d-4d80-b979-725e03851ba6-kube-api-access-h2mrl\") pod \"keystone-bootstrap-s6vdx\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:16 crc kubenswrapper[4735]: I0317 01:29:16.288839 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:17 crc kubenswrapper[4735]: I0317 01:29:17.088038 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="997293a3-f292-4059-bb3b-da392c68ea99" path="/var/lib/kubelet/pods/997293a3-f292-4059-bb3b-da392c68ea99/volumes" Mar 17 01:29:20 crc kubenswrapper[4735]: I0317 01:29:20.432164 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Mar 17 01:29:21 crc kubenswrapper[4735]: I0317 01:29:21.937730 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" containerID="e3eeba2f5c376c0fd87d093ab9ebbfc8f61c30f847d61eccea94c58aa6db7eed" exitCode=0 Mar 17 01:29:21 crc kubenswrapper[4735]: I0317 01:29:21.937791 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6v4rh" event={"ID":"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e","Type":"ContainerDied","Data":"e3eeba2f5c376c0fd87d093ab9ebbfc8f61c30f847d61eccea94c58aa6db7eed"} Mar 17 01:29:22 crc kubenswrapper[4735]: E0317 01:29:22.821400 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:22 crc kubenswrapper[4735]: E0317 01:29:22.822124 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:22 crc kubenswrapper[4735]: E0317 01:29:22.822731 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59bh6hc8h685h585hf7h78h674h697h644h597h667h649h67fh687h685h96h65ch65dhfh66bh64ch697h54h6h68h549hfh58dh5bh68dhbcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rn65f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-55b969888f-hbb2b_openstack(97ca4838-156f-4c10-83f4-cae52f5145b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:29:22 crc kubenswrapper[4735]: E0317 01:29:22.826077 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9\\\"\"]" pod="openstack/horizon-55b969888f-hbb2b" podUID="97ca4838-156f-4c10-83f4-cae52f5145b8" Mar 17 01:29:25 crc kubenswrapper[4735]: I0317 01:29:25.432975 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 01:29:26.800340 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 01:29:26.800709 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 01:29:26.800876 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n696h5b4h5bdhf6h694hddh68bh8dh659h5bfh686h689h65bh645h565h5d6h9dh99h5cch5bdh5d5hdh58dh676h9fh96h648h697h59fhdfh686h56q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j5dhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5784c9cff5-vsxhg_openstack(fb7f27a9-1f78-4033-8b04-4a5adf25024f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 
01:29:26.806149 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9\\\"\"]" pod="openstack/horizon-5784c9cff5-vsxhg" podUID="fb7f27a9-1f78-4033-8b04-4a5adf25024f" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 01:29:26.810677 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 01:29:26.810779 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 01:29:26.811009 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh586h687h548h5dbh659h58bh87h55ch5c8h556h695h598hbbhbchcfh579h696h89h9h87h56h588h568h656h57bh84h5ffh646h669h65fh566q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56rzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-75f648647c-q848z_openstack(b53269e6-dba4-4c47-85c0-38a04c7c760a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:29:26 crc kubenswrapper[4735]: E0317 
01:29:26.817751 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:e43235cb19da04699a53f42b6a75afe9\\\"\"]" pod="openstack/horizon-75f648647c-q848z" podUID="b53269e6-dba4-4c47-85c0-38a04c7c760a" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.236673 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-674b457696-6r8nd"] Mar 17 01:29:27 crc kubenswrapper[4735]: E0317 01:29:27.429114 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:27 crc kubenswrapper[4735]: E0317 01:29:27.429177 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:27 crc kubenswrapper[4735]: E0317 01:29:27.429352 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65h567h55dh5c5hdh555h658h87h55ch5dbh5d9h697h5bh679h5fch548h6fh6h5c8h57fh57ch78h565h54bh59h56h5cfh678h544hb4h84h9dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8slq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(72dc2896-7cb6-4385-91a9-ae28c7f8907a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.436133 4735 scope.go:117] "RemoveContainer" containerID="2a3d731465e25f4a4b23eb9e3a3e584c7c2afff012d3fb5a34661f7d178dc4b4" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.538042 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.615076 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-swift-storage-0\") pod \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.615481 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-sb\") pod \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.615549 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-config\") pod \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.615599 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-svc\") pod \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.615631 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-nb\") pod \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.615655 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wmp5\" 
(UniqueName: \"kubernetes.io/projected/f9283346-2c2b-45c9-90f0-ef25dcc0a250-kube-api-access-5wmp5\") pod \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\" (UID: \"f9283346-2c2b-45c9-90f0-ef25dcc0a250\") " Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.621991 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9283346-2c2b-45c9-90f0-ef25dcc0a250-kube-api-access-5wmp5" (OuterVolumeSpecName: "kube-api-access-5wmp5") pod "f9283346-2c2b-45c9-90f0-ef25dcc0a250" (UID: "f9283346-2c2b-45c9-90f0-ef25dcc0a250"). InnerVolumeSpecName "kube-api-access-5wmp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.655568 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9283346-2c2b-45c9-90f0-ef25dcc0a250" (UID: "f9283346-2c2b-45c9-90f0-ef25dcc0a250"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.655596 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-config" (OuterVolumeSpecName: "config") pod "f9283346-2c2b-45c9-90f0-ef25dcc0a250" (UID: "f9283346-2c2b-45c9-90f0-ef25dcc0a250"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.661016 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9283346-2c2b-45c9-90f0-ef25dcc0a250" (UID: "f9283346-2c2b-45c9-90f0-ef25dcc0a250"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.665325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9283346-2c2b-45c9-90f0-ef25dcc0a250" (UID: "f9283346-2c2b-45c9-90f0-ef25dcc0a250"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.678874 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9283346-2c2b-45c9-90f0-ef25dcc0a250" (UID: "f9283346-2c2b-45c9-90f0-ef25dcc0a250"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.718879 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.718922 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.718939 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.718953 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.718967 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wmp5\" (UniqueName: \"kubernetes.io/projected/f9283346-2c2b-45c9-90f0-ef25dcc0a250-kube-api-access-5wmp5\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.718984 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9283346-2c2b-45c9-90f0-ef25dcc0a250-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.996492 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" event={"ID":"f9283346-2c2b-45c9-90f0-ef25dcc0a250","Type":"ContainerDied","Data":"20e1f776d0261ffaeff5ae7e4f5b73ccd48b481022a63a447a978ba84e942ab6"} Mar 17 01:29:27 crc kubenswrapper[4735]: I0317 01:29:27.996597 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" Mar 17 01:29:28 crc kubenswrapper[4735]: I0317 01:29:28.032041 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9dd5d975-62wvw"] Mar 17 01:29:28 crc kubenswrapper[4735]: I0317 01:29:28.039817 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9dd5d975-62wvw"] Mar 17 01:29:29 crc kubenswrapper[4735]: I0317 01:29:29.084459 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" path="/var/lib/kubelet/pods/f9283346-2c2b-45c9-90f0-ef25dcc0a250/volumes" Mar 17 01:29:30 crc kubenswrapper[4735]: I0317 01:29:30.475430 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f9dd5d975-62wvw" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Mar 17 01:29:36 crc kubenswrapper[4735]: I0317 
01:29:36.856659 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:29:36 crc kubenswrapper[4735]: I0317 01:29:36.862272 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043363 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-config-data\") pod \"97ca4838-156f-4c10-83f4-cae52f5145b8\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043433 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-combined-ca-bundle\") pod \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043488 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn65f\" (UniqueName: \"kubernetes.io/projected/97ca4838-156f-4c10-83f4-cae52f5145b8-kube-api-access-rn65f\") pod \"97ca4838-156f-4c10-83f4-cae52f5145b8\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043521 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ca4838-156f-4c10-83f4-cae52f5145b8-logs\") pod \"97ca4838-156f-4c10-83f4-cae52f5145b8\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043590 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-scripts\") pod 
\"97ca4838-156f-4c10-83f4-cae52f5145b8\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043664 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj9hv\" (UniqueName: \"kubernetes.io/projected/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-kube-api-access-cj9hv\") pod \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043726 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ca4838-156f-4c10-83f4-cae52f5145b8-horizon-secret-key\") pod \"97ca4838-156f-4c10-83f4-cae52f5145b8\" (UID: \"97ca4838-156f-4c10-83f4-cae52f5145b8\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.043782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-config\") pod \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\" (UID: \"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.044011 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ca4838-156f-4c10-83f4-cae52f5145b8-logs" (OuterVolumeSpecName: "logs") pod "97ca4838-156f-4c10-83f4-cae52f5145b8" (UID: "97ca4838-156f-4c10-83f4-cae52f5145b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.044302 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-scripts" (OuterVolumeSpecName: "scripts") pod "97ca4838-156f-4c10-83f4-cae52f5145b8" (UID: "97ca4838-156f-4c10-83f4-cae52f5145b8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.044522 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-config-data" (OuterVolumeSpecName: "config-data") pod "97ca4838-156f-4c10-83f4-cae52f5145b8" (UID: "97ca4838-156f-4c10-83f4-cae52f5145b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.044616 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ca4838-156f-4c10-83f4-cae52f5145b8-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.044641 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.044655 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ca4838-156f-4c10-83f4-cae52f5145b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.053432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-kube-api-access-cj9hv" (OuterVolumeSpecName: "kube-api-access-cj9hv") pod "2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" (UID: "2ef223ff-d0aa-44b8-b8cd-88242ceaee8e"). InnerVolumeSpecName "kube-api-access-cj9hv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.066456 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ca4838-156f-4c10-83f4-cae52f5145b8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "97ca4838-156f-4c10-83f4-cae52f5145b8" (UID: "97ca4838-156f-4c10-83f4-cae52f5145b8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.068404 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ca4838-156f-4c10-83f4-cae52f5145b8-kube-api-access-rn65f" (OuterVolumeSpecName: "kube-api-access-rn65f") pod "97ca4838-156f-4c10-83f4-cae52f5145b8" (UID: "97ca4838-156f-4c10-83f4-cae52f5145b8"). InnerVolumeSpecName "kube-api-access-rn65f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.084078 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-config" (OuterVolumeSpecName: "config") pod "2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" (UID: "2ef223ff-d0aa-44b8-b8cd-88242ceaee8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.091514 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" (UID: "2ef223ff-d0aa-44b8-b8cd-88242ceaee8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.103846 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55b969888f-hbb2b" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.105618 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-6v4rh" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b969888f-hbb2b" event={"ID":"97ca4838-156f-4c10-83f4-cae52f5145b8","Type":"ContainerDied","Data":"4a565031903a99cc1ae2f356357595406bbc668c7cf0ff780fcefe3b4564cdab"} Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-6v4rh" event={"ID":"2ef223ff-d0aa-44b8-b8cd-88242ceaee8e","Type":"ContainerDied","Data":"6e676bb8b24762dfc340645ad4da9a53a8492957536d648c837fa90b5ff5649f"} Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145106 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e676bb8b24762dfc340645ad4da9a53a8492957536d648c837fa90b5ff5649f" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145818 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ca4838-156f-4c10-83f4-cae52f5145b8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145842 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145863 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145873 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rn65f\" (UniqueName: \"kubernetes.io/projected/97ca4838-156f-4c10-83f4-cae52f5145b8-kube-api-access-rn65f\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.145882 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj9hv\" (UniqueName: \"kubernetes.io/projected/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e-kube-api-access-cj9hv\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.199868 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55b969888f-hbb2b"] Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.219143 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55b969888f-hbb2b"] Mar 17 01:29:37 crc kubenswrapper[4735]: E0317 01:29:37.331964 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:37 crc kubenswrapper[4735]: E0317 01:29:37.332347 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:37 crc kubenswrapper[4735]: E0317 01:29:37.332482 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96m7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-97m76_openstack(f4f6de0a-1a49-4470-80f3-5c807d5899a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
17 01:29:37 crc kubenswrapper[4735]: E0317 01:29:37.341465 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-97m76" podUID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.352987 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.454622 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb7f27a9-1f78-4033-8b04-4a5adf25024f-logs\") pod \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.454678 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5dhv\" (UniqueName: \"kubernetes.io/projected/fb7f27a9-1f78-4033-8b04-4a5adf25024f-kube-api-access-j5dhv\") pod \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.454703 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-config-data\") pod \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.454752 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb7f27a9-1f78-4033-8b04-4a5adf25024f-horizon-secret-key\") pod \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.454789 
4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-scripts\") pod \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\" (UID: \"fb7f27a9-1f78-4033-8b04-4a5adf25024f\") " Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.455671 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-scripts" (OuterVolumeSpecName: "scripts") pod "fb7f27a9-1f78-4033-8b04-4a5adf25024f" (UID: "fb7f27a9-1f78-4033-8b04-4a5adf25024f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.455798 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7f27a9-1f78-4033-8b04-4a5adf25024f-logs" (OuterVolumeSpecName: "logs") pod "fb7f27a9-1f78-4033-8b04-4a5adf25024f" (UID: "fb7f27a9-1f78-4033-8b04-4a5adf25024f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.456519 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-config-data" (OuterVolumeSpecName: "config-data") pod "fb7f27a9-1f78-4033-8b04-4a5adf25024f" (UID: "fb7f27a9-1f78-4033-8b04-4a5adf25024f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.459398 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7f27a9-1f78-4033-8b04-4a5adf25024f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fb7f27a9-1f78-4033-8b04-4a5adf25024f" (UID: "fb7f27a9-1f78-4033-8b04-4a5adf25024f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.461463 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7f27a9-1f78-4033-8b04-4a5adf25024f-kube-api-access-j5dhv" (OuterVolumeSpecName: "kube-api-access-j5dhv") pod "fb7f27a9-1f78-4033-8b04-4a5adf25024f" (UID: "fb7f27a9-1f78-4033-8b04-4a5adf25024f"). InnerVolumeSpecName "kube-api-access-j5dhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.557375 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb7f27a9-1f78-4033-8b04-4a5adf25024f-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.557416 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5dhv\" (UniqueName: \"kubernetes.io/projected/fb7f27a9-1f78-4033-8b04-4a5adf25024f-kube-api-access-j5dhv\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.557432 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.557445 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb7f27a9-1f78-4033-8b04-4a5adf25024f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:37 crc kubenswrapper[4735]: I0317 01:29:37.557459 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb7f27a9-1f78-4033-8b04-4a5adf25024f-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.140782 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c46b7ff7-gkr8h"] Mar 
17 01:29:38 crc kubenswrapper[4735]: E0317 01:29:38.141821 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="init" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.141840 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="init" Mar 17 01:29:38 crc kubenswrapper[4735]: E0317 01:29:38.141879 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.141914 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" Mar 17 01:29:38 crc kubenswrapper[4735]: E0317 01:29:38.141930 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" containerName="neutron-db-sync" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.141937 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" containerName="neutron-db-sync" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.142101 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" containerName="neutron-db-sync" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.142131 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9283346-2c2b-45c9-90f0-ef25dcc0a250" containerName="dnsmasq-dns" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.143115 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.148747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5784c9cff5-vsxhg" event={"ID":"fb7f27a9-1f78-4033-8b04-4a5adf25024f","Type":"ContainerDied","Data":"54181bda33711c857fabdc5a179b2d1e75b3f74eeda7f09cdf8c2b20af232d31"} Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.148875 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5784c9cff5-vsxhg" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.167678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b457696-6r8nd" event={"ID":"32c72925-26da-41c7-8279-8bc23ef68b62","Type":"ContainerStarted","Data":"95a691c501f48dc3248fa26337e8f1aad8806a5f25e5ac42eab5117080d76469"} Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.167958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-config\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.168051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-swift-storage-0\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.168082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-nb\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" 
(UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.168125 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72f4\" (UniqueName: \"kubernetes.io/projected/9753d2c8-2d80-4019-965a-af117590e79f-kube-api-access-w72f4\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.168155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-sb\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.168275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-svc\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: E0317 01:29:38.169425 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/heat-db-sync-97m76" podUID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.191838 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c46b7ff7-gkr8h"] Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.269645 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-svc\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.269693 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-config\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.269739 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-swift-storage-0\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.269764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-nb\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.269812 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72f4\" (UniqueName: \"kubernetes.io/projected/9753d2c8-2d80-4019-965a-af117590e79f-kube-api-access-w72f4\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.269839 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-sb\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.271681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-svc\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.271831 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-nb\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.272441 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-config\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.272800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-swift-storage-0\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.272813 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-sb\") 
pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.293077 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5784c9cff5-vsxhg"] Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.308347 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5784c9cff5-vsxhg"] Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.311730 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72f4\" (UniqueName: \"kubernetes.io/projected/9753d2c8-2d80-4019-965a-af117590e79f-kube-api-access-w72f4\") pod \"dnsmasq-dns-75c46b7ff7-gkr8h\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.344542 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fbbbfb4-v48lt"] Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.347709 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.350698 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.350811 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.356674 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.356753 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zdzns" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.368068 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbbbfb4-v48lt"] Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.375434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-combined-ca-bundle\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.375887 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-ovndb-tls-certs\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.376008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-httpd-config\") pod \"neutron-fbbbfb4-v48lt\" (UID: 
\"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.376108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-config\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.376243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b74t5\" (UniqueName: \"kubernetes.io/projected/55f65e0a-d223-4c97-8911-d903950feb61-kube-api-access-b74t5\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.481929 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-combined-ca-bundle\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.482055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-ovndb-tls-certs\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.482132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-httpd-config\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 
01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.482201 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-config\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.482252 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b74t5\" (UniqueName: \"kubernetes.io/projected/55f65e0a-d223-4c97-8911-d903950feb61-kube-api-access-b74t5\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.482828 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.487896 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-ovndb-tls-certs\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.488653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-config\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.492222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-combined-ca-bundle\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " 
pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.505197 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-httpd-config\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.517014 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b74t5\" (UniqueName: \"kubernetes.io/projected/55f65e0a-d223-4c97-8911-d903950feb61-kube-api-access-b74t5\") pod \"neutron-fbbbfb4-v48lt\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:38 crc kubenswrapper[4735]: I0317 01:29:38.678939 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.082135 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ca4838-156f-4c10-83f4-cae52f5145b8" path="/var/lib/kubelet/pods/97ca4838-156f-4c10-83f4-cae52f5145b8/volumes" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.082561 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7f27a9-1f78-4033-8b04-4a5adf25024f" path="/var/lib/kubelet/pods/fb7f27a9-1f78-4033-8b04-4a5adf25024f/volumes" Mar 17 01:29:39 crc kubenswrapper[4735]: E0317 01:29:39.352385 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:39 crc kubenswrapper[4735]: E0317 01:29:39.352680 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:29:39 crc kubenswrapper[4735]: E0317 01:29:39.352790 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:e43235cb19da04699a53f42b6a75afe9,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tfxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPro
pagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bl4z7_openstack(bcfd83df-2abe-43e9-ab0b-88ec269eb204): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:29:39 crc kubenswrapper[4735]: E0317 01:29:39.358698 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bl4z7" podUID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.372273 4735 scope.go:117] "RemoveContainer" containerID="360cd0c8d15605e0761da4bb5a065d5ffb02cf915d548deb946f2e90553c77f1" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.528712 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.599784 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rzs\" (UniqueName: \"kubernetes.io/projected/b53269e6-dba4-4c47-85c0-38a04c7c760a-kube-api-access-56rzs\") pod \"b53269e6-dba4-4c47-85c0-38a04c7c760a\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.599817 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-scripts\") pod \"b53269e6-dba4-4c47-85c0-38a04c7c760a\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.599891 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b53269e6-dba4-4c47-85c0-38a04c7c760a-horizon-secret-key\") pod \"b53269e6-dba4-4c47-85c0-38a04c7c760a\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.599931 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-config-data\") pod \"b53269e6-dba4-4c47-85c0-38a04c7c760a\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.600119 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53269e6-dba4-4c47-85c0-38a04c7c760a-logs\") pod \"b53269e6-dba4-4c47-85c0-38a04c7c760a\" (UID: \"b53269e6-dba4-4c47-85c0-38a04c7c760a\") " Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.601359 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-config-data" (OuterVolumeSpecName: "config-data") pod "b53269e6-dba4-4c47-85c0-38a04c7c760a" (UID: "b53269e6-dba4-4c47-85c0-38a04c7c760a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.601517 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-scripts" (OuterVolumeSpecName: "scripts") pod "b53269e6-dba4-4c47-85c0-38a04c7c760a" (UID: "b53269e6-dba4-4c47-85c0-38a04c7c760a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.606726 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53269e6-dba4-4c47-85c0-38a04c7c760a-logs" (OuterVolumeSpecName: "logs") pod "b53269e6-dba4-4c47-85c0-38a04c7c760a" (UID: "b53269e6-dba4-4c47-85c0-38a04c7c760a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.611995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53269e6-dba4-4c47-85c0-38a04c7c760a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b53269e6-dba4-4c47-85c0-38a04c7c760a" (UID: "b53269e6-dba4-4c47-85c0-38a04c7c760a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.624832 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53269e6-dba4-4c47-85c0-38a04c7c760a-kube-api-access-56rzs" (OuterVolumeSpecName: "kube-api-access-56rzs") pod "b53269e6-dba4-4c47-85c0-38a04c7c760a" (UID: "b53269e6-dba4-4c47-85c0-38a04c7c760a"). InnerVolumeSpecName "kube-api-access-56rzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.663019 4735 scope.go:117] "RemoveContainer" containerID="a08806e2f22728ee39b447fcdf79546d8164c13afe9d36b9914febe0696906c7" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.708110 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53269e6-dba4-4c47-85c0-38a04c7c760a-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.708134 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rzs\" (UniqueName: \"kubernetes.io/projected/b53269e6-dba4-4c47-85c0-38a04c7c760a-kube-api-access-56rzs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.708143 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.708152 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b53269e6-dba4-4c47-85c0-38a04c7c760a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4735]: I0317 01:29:39.708162 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b53269e6-dba4-4c47-85c0-38a04c7c760a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.153529 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.211430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b457696-6r8nd" 
event={"ID":"32c72925-26da-41c7-8279-8bc23ef68b62","Type":"ContainerStarted","Data":"88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9"} Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.217925 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75f648647c-q848z" event={"ID":"b53269e6-dba4-4c47-85c0-38a04c7c760a","Type":"ContainerDied","Data":"4194129c91af774b2d220bd0ae8aab11313db9b67b9cd3b0bae8cbf186b9992b"} Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.218060 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75f648647c-q848z" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.227053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cw9fr" event={"ID":"322cb9be-148e-4dc0-8e9c-ae716ed6925f","Type":"ContainerStarted","Data":"26f9ee346dc5c7523fb01f1805609575ccf514e9b54d884ebae6ca9193f32627"} Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.240254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-94fhf" event={"ID":"4010b52e-ec5e-431e-9a94-48a03dfdcce6","Type":"ContainerStarted","Data":"c21e38ee0e20a18a4196446c8deb43608008ad76851d0decefa2fa915c851264"} Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.242277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37dddec5-15b5-4e45-880b-10225e30412c","Type":"ContainerStarted","Data":"7119e80535f25cd52c1996b4ec6b2329d20e0b561b9c5b091b6ebf2f58f891ea"} Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.259209 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cw9fr" podStartSLOduration=3.047562899 podStartE2EDuration="41.259187228s" podCreationTimestamp="2026-03-17 01:28:59 +0000 UTC" firstStartedPulling="2026-03-17 01:29:01.147578652 +0000 UTC m=+1166.779811620" lastFinishedPulling="2026-03-17 
01:29:39.359202971 +0000 UTC m=+1204.991435949" observedRunningTime="2026-03-17 01:29:40.250827362 +0000 UTC m=+1205.883060340" watchObservedRunningTime="2026-03-17 01:29:40.259187228 +0000 UTC m=+1205.891420216" Mar 17 01:29:40 crc kubenswrapper[4735]: E0317 01:29:40.267951 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/cinder-db-sync-bl4z7" podUID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.286080 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-94fhf" podStartSLOduration=2.661677311 podStartE2EDuration="40.28606334s" podCreationTimestamp="2026-03-17 01:29:00 +0000 UTC" firstStartedPulling="2026-03-17 01:29:01.694878748 +0000 UTC m=+1167.327111726" lastFinishedPulling="2026-03-17 01:29:39.319264777 +0000 UTC m=+1204.951497755" observedRunningTime="2026-03-17 01:29:40.283009055 +0000 UTC m=+1205.915242033" watchObservedRunningTime="2026-03-17 01:29:40.28606334 +0000 UTC m=+1205.918296318" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.369732 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c46b7ff7-gkr8h"] Mar 17 01:29:40 crc kubenswrapper[4735]: W0317 01:29:40.409414 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c5b000f_5d9d_4d80_b979_725e03851ba6.slice/crio-8b58388fc4642a2bd084816d1e0421a7b413bd26a52c47c1c96e113310a055d2 WatchSource:0}: Error finding container 8b58388fc4642a2bd084816d1e0421a7b413bd26a52c47c1c96e113310a055d2: Status 404 returned error can't find the container with id 8b58388fc4642a2bd084816d1e0421a7b413bd26a52c47c1c96e113310a055d2 Mar 17 01:29:40 crc kubenswrapper[4735]: 
I0317 01:29:40.409495 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75f648647c-q848z"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.433271 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75f648647c-q848z"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.434922 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s6vdx"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.448065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.566341 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76cdf95cd8-vx5pd"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.659198 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.800143 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77b7845dbf-mbcjv"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.801830 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.806196 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.806459 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.835812 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77b7845dbf-mbcjv"] Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.872779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-httpd-config\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.872847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-public-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.872892 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-internal-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.872930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-ovndb-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.872946 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-combined-ca-bundle\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.872976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjtmh\" (UniqueName: \"kubernetes.io/projected/7baa4934-72f6-4235-9168-f7dbc175b63d-kube-api-access-fjtmh\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.873017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-config\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-ovndb-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-combined-ca-bundle\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974448 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjtmh\" (UniqueName: \"kubernetes.io/projected/7baa4934-72f6-4235-9168-f7dbc175b63d-kube-api-access-fjtmh\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974502 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-config\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-httpd-config\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974575 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-public-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.974602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-internal-tls-certs\") pod 
\"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.979265 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-internal-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.982033 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-public-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.983628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-httpd-config\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.984947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-combined-ca-bundle\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.986022 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-ovndb-tls-certs\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 
01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.987617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-config\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:40 crc kubenswrapper[4735]: I0317 01:29:40.993314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjtmh\" (UniqueName: \"kubernetes.io/projected/7baa4934-72f6-4235-9168-f7dbc175b63d-kube-api-access-fjtmh\") pod \"neutron-77b7845dbf-mbcjv\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.084522 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53269e6-dba4-4c47-85c0-38a04c7c760a" path="/var/lib/kubelet/pods/b53269e6-dba4-4c47-85c0-38a04c7c760a/volumes" Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.131502 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:41 crc kubenswrapper[4735]: W0317 01:29:41.197935 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod762ac106_44fe_4a09_9dcf_a55d7c4573fe.slice/crio-21ad39983cc00ae0c75f1ff290b9a3e8a0558db2efcba90250d71eedb4ce14de WatchSource:0}: Error finding container 21ad39983cc00ae0c75f1ff290b9a3e8a0558db2efcba90250d71eedb4ce14de: Status 404 returned error can't find the container with id 21ad39983cc00ae0c75f1ff290b9a3e8a0558db2efcba90250d71eedb4ce14de Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.266666 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76cdf95cd8-vx5pd" event={"ID":"fc3f6d90-40e7-4962-b788-1e9924edb48f","Type":"ContainerStarted","Data":"894501b3e5fdf8dde781d1cb3c57faea6591706897ba3e735531244176709b01"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.277049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b457696-6r8nd" event={"ID":"32c72925-26da-41c7-8279-8bc23ef68b62","Type":"ContainerStarted","Data":"67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.278611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" event={"ID":"9753d2c8-2d80-4019-965a-af117590e79f","Type":"ContainerStarted","Data":"71b42a3379d6afdf6d6df2809665bf8e98cae69e39d4933facf703b0cc6306f9"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.304074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37dddec5-15b5-4e45-880b-10225e30412c","Type":"ContainerStarted","Data":"f91aa2371c330a548923615f99c1fd2995b9c30c9ba7438cd3426ffed2aadb00"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.315354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-s6vdx" event={"ID":"8c5b000f-5d9d-4d80-b979-725e03851ba6","Type":"ContainerStarted","Data":"68cb4547e382ebf1c68944841418b06d8ebdd75b3cce0044083697df9e27f405"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.315389 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s6vdx" event={"ID":"8c5b000f-5d9d-4d80-b979-725e03851ba6","Type":"ContainerStarted","Data":"8b58388fc4642a2bd084816d1e0421a7b413bd26a52c47c1c96e113310a055d2"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.362039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"762ac106-44fe-4a09-9dcf-a55d7c4573fe","Type":"ContainerStarted","Data":"21ad39983cc00ae0c75f1ff290b9a3e8a0558db2efcba90250d71eedb4ce14de"} Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.381545 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-674b457696-6r8nd" podStartSLOduration=31.186129215 podStartE2EDuration="33.381525912s" podCreationTimestamp="2026-03-17 01:29:08 +0000 UTC" firstStartedPulling="2026-03-17 01:29:37.321639974 +0000 UTC m=+1202.953872952" lastFinishedPulling="2026-03-17 01:29:39.517036681 +0000 UTC m=+1205.149269649" observedRunningTime="2026-03-17 01:29:41.304267359 +0000 UTC m=+1206.936500337" watchObservedRunningTime="2026-03-17 01:29:41.381525912 +0000 UTC m=+1207.013758890" Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.382642 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s6vdx" podStartSLOduration=26.38263683 podStartE2EDuration="26.38263683s" podCreationTimestamp="2026-03-17 01:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:41.352176709 +0000 UTC m=+1206.984409687" watchObservedRunningTime="2026-03-17 01:29:41.38263683 +0000 UTC 
m=+1207.014869808" Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.537686 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbbbfb4-v48lt"] Mar 17 01:29:41 crc kubenswrapper[4735]: W0317 01:29:41.610342 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55f65e0a_d223_4c97_8911_d903950feb61.slice/crio-43b75518a88aaf9aa00570d062d964a8c64fdbf4e4739393442ad397e82ae65c WatchSource:0}: Error finding container 43b75518a88aaf9aa00570d062d964a8c64fdbf4e4739393442ad397e82ae65c: Status 404 returned error can't find the container with id 43b75518a88aaf9aa00570d062d964a8c64fdbf4e4739393442ad397e82ae65c Mar 17 01:29:41 crc kubenswrapper[4735]: I0317 01:29:41.995219 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77b7845dbf-mbcjv"] Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.382102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77b7845dbf-mbcjv" event={"ID":"7baa4934-72f6-4235-9168-f7dbc175b63d","Type":"ContainerStarted","Data":"02563a274a4cdb0d52f3f87ac4883a2211dddd48de1c14bc5c16950952743054"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.398652 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"762ac106-44fe-4a09-9dcf-a55d7c4573fe","Type":"ContainerStarted","Data":"0add670582044a8cc9dbed9bb3d32df42740b89106b4b002c039ddf98a005495"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.408075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbbbfb4-v48lt" event={"ID":"55f65e0a-d223-4c97-8911-d903950feb61","Type":"ContainerStarted","Data":"6c22d0a33b7a7feb5c716d97ef6a490ca1f3f90d8d3e30b968cf89ada6c7f3db"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.408383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbbbfb4-v48lt" 
event={"ID":"55f65e0a-d223-4c97-8911-d903950feb61","Type":"ContainerStarted","Data":"43b75518a88aaf9aa00570d062d964a8c64fdbf4e4739393442ad397e82ae65c"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.426196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76cdf95cd8-vx5pd" event={"ID":"fc3f6d90-40e7-4962-b788-1e9924edb48f","Type":"ContainerStarted","Data":"504b89dfff2cc4a60c9c736b725c20fdc45de092ece295335ece8e3f09772451"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.426235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76cdf95cd8-vx5pd" event={"ID":"fc3f6d90-40e7-4962-b788-1e9924edb48f","Type":"ContainerStarted","Data":"abb517dad47b7d32caa32cd70fec5fb07e6e964b854054c8ef19a839fa43651d"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.444043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72dc2896-7cb6-4385-91a9-ae28c7f8907a","Type":"ContainerStarted","Data":"42e38c0b456e388752e43b9caa8cfb5117ac218cad2a6a4ed2ea4e36c016b6fa"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.451620 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76cdf95cd8-vx5pd" podStartSLOduration=34.45160441 podStartE2EDuration="34.45160441s" podCreationTimestamp="2026-03-17 01:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:42.449117059 +0000 UTC m=+1208.081350037" watchObservedRunningTime="2026-03-17 01:29:42.45160441 +0000 UTC m=+1208.083837388" Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.457397 4735 generic.go:334] "Generic (PLEG): container finished" podID="9753d2c8-2d80-4019-965a-af117590e79f" containerID="3bacd3aafc06b6db3108ef9de8c2a3248c6ade2a6af01e1f1f7da54d51e19845" exitCode=0 Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.458844 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" event={"ID":"9753d2c8-2d80-4019-965a-af117590e79f","Type":"ContainerDied","Data":"3bacd3aafc06b6db3108ef9de8c2a3248c6ade2a6af01e1f1f7da54d51e19845"} Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.607972 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.608018 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.608321 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.608931 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3cc1abd0b398fcd26237f6714e9b38efa46867f6f292d0e7478cf3af5b13d4a"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:29:42 crc kubenswrapper[4735]: I0317 01:29:42.608975 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://f3cc1abd0b398fcd26237f6714e9b38efa46867f6f292d0e7478cf3af5b13d4a" gracePeriod=600 Mar 17 
01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.477166 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbbbfb4-v48lt" event={"ID":"55f65e0a-d223-4c97-8911-d903950feb61","Type":"ContainerStarted","Data":"8438c54bbe429594476cc802adbf16afabcc27cd0ebfba431d5a1cb790fc7682"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.477628 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.489535 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="f3cc1abd0b398fcd26237f6714e9b38efa46867f6f292d0e7478cf3af5b13d4a" exitCode=0 Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.489611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"f3cc1abd0b398fcd26237f6714e9b38efa46867f6f292d0e7478cf3af5b13d4a"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.489646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"e8261aabe9466b0d5bcb35dde5a9ae7325de3d924eb46b6cef2441ec6483254b"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.489668 4735 scope.go:117] "RemoveContainer" containerID="7547be811f368c1658ee9c2df155bfbafe2bd7de1f9eeaf1d1a2d6245d328110" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.498999 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fbbbfb4-v48lt" podStartSLOduration=5.498983029 podStartE2EDuration="5.498983029s" podCreationTimestamp="2026-03-17 01:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-17 01:29:43.497731838 +0000 UTC m=+1209.129964816" watchObservedRunningTime="2026-03-17 01:29:43.498983029 +0000 UTC m=+1209.131216007" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.500782 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37dddec5-15b5-4e45-880b-10225e30412c","Type":"ContainerStarted","Data":"28057c7a8bb9f4b30d111669f97c3397827c24b0312f1d99a5cd90d8a0a1ce5f"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.516031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77b7845dbf-mbcjv" event={"ID":"7baa4934-72f6-4235-9168-f7dbc175b63d","Type":"ContainerStarted","Data":"c6f64c6db3dd3e5c32fbbded6c1c35b16caf87a21b0878cd42eb55043dafc2f8"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.516273 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77b7845dbf-mbcjv" event={"ID":"7baa4934-72f6-4235-9168-f7dbc175b63d","Type":"ContainerStarted","Data":"7fc5010997d8e172531eb08ef5e624286b7138788890be11979a5133c277fd45"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.516461 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.553186 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" event={"ID":"9753d2c8-2d80-4019-965a-af117590e79f","Type":"ContainerStarted","Data":"a680d6fbf902d3c23d3b87a260006e1e97ade12bbe5307e7ae17b7161fad046a"} Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.553317 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.571683 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=34.57166505 
podStartE2EDuration="34.57166505s" podCreationTimestamp="2026-03-17 01:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:43.541337012 +0000 UTC m=+1209.173569990" watchObservedRunningTime="2026-03-17 01:29:43.57166505 +0000 UTC m=+1209.203898028" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.584598 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77b7845dbf-mbcjv" podStartSLOduration=3.584574157 podStartE2EDuration="3.584574157s" podCreationTimestamp="2026-03-17 01:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:43.569565327 +0000 UTC m=+1209.201798305" watchObservedRunningTime="2026-03-17 01:29:43.584574157 +0000 UTC m=+1209.216807135" Mar 17 01:29:43 crc kubenswrapper[4735]: I0317 01:29:43.596800 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" podStartSLOduration=5.596785789 podStartE2EDuration="5.596785789s" podCreationTimestamp="2026-03-17 01:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:43.5895906 +0000 UTC m=+1209.221823578" watchObservedRunningTime="2026-03-17 01:29:43.596785789 +0000 UTC m=+1209.229018767" Mar 17 01:29:44 crc kubenswrapper[4735]: I0317 01:29:44.568080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"762ac106-44fe-4a09-9dcf-a55d7c4573fe","Type":"ContainerStarted","Data":"6e666d4e4c78c41a6d71e9a0fdf011dba9f4b17db66cfde061aff5aaf6438c05"} Mar 17 01:29:44 crc kubenswrapper[4735]: I0317 01:29:44.590888 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=36.590873864 podStartE2EDuration="36.590873864s" podCreationTimestamp="2026-03-17 01:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:44.583742647 +0000 UTC m=+1210.215975625" watchObservedRunningTime="2026-03-17 01:29:44.590873864 +0000 UTC m=+1210.223106842" Mar 17 01:29:45 crc kubenswrapper[4735]: I0317 01:29:45.578419 4735 generic.go:334] "Generic (PLEG): container finished" podID="4010b52e-ec5e-431e-9a94-48a03dfdcce6" containerID="c21e38ee0e20a18a4196446c8deb43608008ad76851d0decefa2fa915c851264" exitCode=0 Mar 17 01:29:45 crc kubenswrapper[4735]: I0317 01:29:45.578474 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-94fhf" event={"ID":"4010b52e-ec5e-431e-9a94-48a03dfdcce6","Type":"ContainerDied","Data":"c21e38ee0e20a18a4196446c8deb43608008ad76851d0decefa2fa915c851264"} Mar 17 01:29:46 crc kubenswrapper[4735]: I0317 01:29:46.587757 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c5b000f-5d9d-4d80-b979-725e03851ba6" containerID="68cb4547e382ebf1c68944841418b06d8ebdd75b3cce0044083697df9e27f405" exitCode=0 Mar 17 01:29:46 crc kubenswrapper[4735]: I0317 01:29:46.587842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s6vdx" event={"ID":"8c5b000f-5d9d-4d80-b979-725e03851ba6","Type":"ContainerDied","Data":"68cb4547e382ebf1c68944841418b06d8ebdd75b3cce0044083697df9e27f405"} Mar 17 01:29:46 crc kubenswrapper[4735]: I0317 01:29:46.597165 4735 generic.go:334] "Generic (PLEG): container finished" podID="322cb9be-148e-4dc0-8e9c-ae716ed6925f" containerID="26f9ee346dc5c7523fb01f1805609575ccf514e9b54d884ebae6ca9193f32627" exitCode=0 Mar 17 01:29:46 crc kubenswrapper[4735]: I0317 01:29:46.597225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cw9fr" 
event={"ID":"322cb9be-148e-4dc0-8e9c-ae716ed6925f","Type":"ContainerDied","Data":"26f9ee346dc5c7523fb01f1805609575ccf514e9b54d884ebae6ca9193f32627"} Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.302297 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.314169 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.314484 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.444813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-fernet-keys\") pod \"8c5b000f-5d9d-4d80-b979-725e03851ba6\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.444863 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-scripts\") pod \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.444940 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2mrl\" (UniqueName: \"kubernetes.io/projected/8c5b000f-5d9d-4d80-b979-725e03851ba6-kube-api-access-h2mrl\") pod \"8c5b000f-5d9d-4d80-b979-725e03851ba6\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.444974 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-config-data\") pod \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.444993 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-db-sync-config-data\") pod \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-scripts\") pod \"8c5b000f-5d9d-4d80-b979-725e03851ba6\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445074 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4010b52e-ec5e-431e-9a94-48a03dfdcce6-logs\") pod \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445093 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-combined-ca-bundle\") pod \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445114 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-credential-keys\") pod \"8c5b000f-5d9d-4d80-b979-725e03851ba6\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445131 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw4wm\" (UniqueName: \"kubernetes.io/projected/4010b52e-ec5e-431e-9a94-48a03dfdcce6-kube-api-access-bw4wm\") pod \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\" (UID: \"4010b52e-ec5e-431e-9a94-48a03dfdcce6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445192 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x2nc\" (UniqueName: \"kubernetes.io/projected/322cb9be-148e-4dc0-8e9c-ae716ed6925f-kube-api-access-2x2nc\") pod \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445234 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-config-data\") pod \"8c5b000f-5d9d-4d80-b979-725e03851ba6\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445265 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-combined-ca-bundle\") pod \"8c5b000f-5d9d-4d80-b979-725e03851ba6\" (UID: \"8c5b000f-5d9d-4d80-b979-725e03851ba6\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.445306 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-combined-ca-bundle\") pod \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\" (UID: \"322cb9be-148e-4dc0-8e9c-ae716ed6925f\") " Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.446359 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4010b52e-ec5e-431e-9a94-48a03dfdcce6-logs" (OuterVolumeSpecName: "logs") pod 
"4010b52e-ec5e-431e-9a94-48a03dfdcce6" (UID: "4010b52e-ec5e-431e-9a94-48a03dfdcce6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.462405 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4010b52e-ec5e-431e-9a94-48a03dfdcce6-kube-api-access-bw4wm" (OuterVolumeSpecName: "kube-api-access-bw4wm") pod "4010b52e-ec5e-431e-9a94-48a03dfdcce6" (UID: "4010b52e-ec5e-431e-9a94-48a03dfdcce6"). InnerVolumeSpecName "kube-api-access-bw4wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.463336 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "322cb9be-148e-4dc0-8e9c-ae716ed6925f" (UID: "322cb9be-148e-4dc0-8e9c-ae716ed6925f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.473953 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5b000f-5d9d-4d80-b979-725e03851ba6-kube-api-access-h2mrl" (OuterVolumeSpecName: "kube-api-access-h2mrl") pod "8c5b000f-5d9d-4d80-b979-725e03851ba6" (UID: "8c5b000f-5d9d-4d80-b979-725e03851ba6"). InnerVolumeSpecName "kube-api-access-h2mrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.474098 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8c5b000f-5d9d-4d80-b979-725e03851ba6" (UID: "8c5b000f-5d9d-4d80-b979-725e03851ba6"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.482683 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8c5b000f-5d9d-4d80-b979-725e03851ba6" (UID: "8c5b000f-5d9d-4d80-b979-725e03851ba6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.483995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-scripts" (OuterVolumeSpecName: "scripts") pod "8c5b000f-5d9d-4d80-b979-725e03851ba6" (UID: "8c5b000f-5d9d-4d80-b979-725e03851ba6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.484400 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322cb9be-148e-4dc0-8e9c-ae716ed6925f-kube-api-access-2x2nc" (OuterVolumeSpecName: "kube-api-access-2x2nc") pod "322cb9be-148e-4dc0-8e9c-ae716ed6925f" (UID: "322cb9be-148e-4dc0-8e9c-ae716ed6925f"). InnerVolumeSpecName "kube-api-access-2x2nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.486948 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.517021 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-scripts" (OuterVolumeSpecName: "scripts") pod "4010b52e-ec5e-431e-9a94-48a03dfdcce6" (UID: "4010b52e-ec5e-431e-9a94-48a03dfdcce6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.532993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-config-data" (OuterVolumeSpecName: "config-data") pod "4010b52e-ec5e-431e-9a94-48a03dfdcce6" (UID: "4010b52e-ec5e-431e-9a94-48a03dfdcce6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.547480 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2mrl\" (UniqueName: \"kubernetes.io/projected/8c5b000f-5d9d-4d80-b979-725e03851ba6-kube-api-access-h2mrl\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.547716 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.547818 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.547913 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.548038 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4010b52e-ec5e-431e-9a94-48a03dfdcce6-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.548135 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.548282 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw4wm\" (UniqueName: \"kubernetes.io/projected/4010b52e-ec5e-431e-9a94-48a03dfdcce6-kube-api-access-bw4wm\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.548303 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x2nc\" (UniqueName: \"kubernetes.io/projected/322cb9be-148e-4dc0-8e9c-ae716ed6925f-kube-api-access-2x2nc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.548315 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.548325 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.553990 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4010b52e-ec5e-431e-9a94-48a03dfdcce6" (UID: "4010b52e-ec5e-431e-9a94-48a03dfdcce6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.560365 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "322cb9be-148e-4dc0-8e9c-ae716ed6925f" (UID: "322cb9be-148e-4dc0-8e9c-ae716ed6925f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.588329 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8b7566d9-cvgwg"] Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.588532 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="dnsmasq-dns" containerID="cri-o://bb2c4098e1e6d7df14476cb04bda10bf346296350b7bbaf6ac2863084548a391" gracePeriod=10 Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.617152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-config-data" (OuterVolumeSpecName: "config-data") pod "8c5b000f-5d9d-4d80-b979-725e03851ba6" (UID: "8c5b000f-5d9d-4d80-b979-725e03851ba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.620538 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c5b000f-5d9d-4d80-b979-725e03851ba6" (UID: "8c5b000f-5d9d-4d80-b979-725e03851ba6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.648464 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-94fhf" event={"ID":"4010b52e-ec5e-431e-9a94-48a03dfdcce6","Type":"ContainerDied","Data":"1d2a5bf899a2cd868e4e65f248cc9a0d7b0d6f881aa9969d99d9bf1647d7cb58"} Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.648500 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d2a5bf899a2cd868e4e65f248cc9a0d7b0d6f881aa9969d99d9bf1647d7cb58" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.648562 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-94fhf" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.651571 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.651603 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b000f-5d9d-4d80-b979-725e03851ba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.651614 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/322cb9be-148e-4dc0-8e9c-ae716ed6925f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.651622 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4010b52e-ec5e-431e-9a94-48a03dfdcce6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.658828 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.658938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.667190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cw9fr" event={"ID":"322cb9be-148e-4dc0-8e9c-ae716ed6925f","Type":"ContainerDied","Data":"a7972b5f934a0773f836db290b17b1e4f415cb080686081d07db6446fbe6aaa6"} Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.667371 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7972b5f934a0773f836db290b17b1e4f415cb080686081d07db6446fbe6aaa6" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.667555 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cw9fr" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.688080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s6vdx" event={"ID":"8c5b000f-5d9d-4d80-b979-725e03851ba6","Type":"ContainerDied","Data":"8b58388fc4642a2bd084816d1e0421a7b413bd26a52c47c1c96e113310a055d2"} Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.688119 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b58388fc4642a2bd084816d1e0421a7b413bd26a52c47c1c96e113310a055d2" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.688202 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s6vdx" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766129 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9455b57b9-66mtr"] Mar 17 01:29:48 crc kubenswrapper[4735]: E0317 01:29:48.766531 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4010b52e-ec5e-431e-9a94-48a03dfdcce6" containerName="placement-db-sync" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766571 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4010b52e-ec5e-431e-9a94-48a03dfdcce6" containerName="placement-db-sync" Mar 17 01:29:48 crc kubenswrapper[4735]: E0317 01:29:48.766583 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5b000f-5d9d-4d80-b979-725e03851ba6" containerName="keystone-bootstrap" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766589 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5b000f-5d9d-4d80-b979-725e03851ba6" containerName="keystone-bootstrap" Mar 17 01:29:48 crc kubenswrapper[4735]: E0317 01:29:48.766614 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322cb9be-148e-4dc0-8e9c-ae716ed6925f" containerName="barbican-db-sync" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766620 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="322cb9be-148e-4dc0-8e9c-ae716ed6925f" containerName="barbican-db-sync" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766835 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="322cb9be-148e-4dc0-8e9c-ae716ed6925f" containerName="barbican-db-sync" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766877 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5b000f-5d9d-4d80-b979-725e03851ba6" containerName="keystone-bootstrap" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.766891 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4010b52e-ec5e-431e-9a94-48a03dfdcce6" containerName="placement-db-sync" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.767614 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.773177 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.773365 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.773536 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.773913 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.774136 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.774260 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p2vmn" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.807762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9455b57b9-66mtr"] Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.818469 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:48 crc kubenswrapper[4735]: I0317 01:29:48.821488 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.856894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-fernet-keys\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-scripts\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857192 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rfhj\" (UniqueName: \"kubernetes.io/projected/5a6d3039-699c-477b-a835-2b6fa2709dde-kube-api-access-7rfhj\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-combined-ca-bundle\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857242 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-public-tls-certs\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-credential-keys\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857316 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-config-data\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.857343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-internal-tls-certs\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.907885 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8d99585df-2dfcf"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.909456 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.919428 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.919641 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.919741 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nk4hc" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.922588 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8d99585df-2dfcf"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.992769 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-public-tls-certs\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.992846 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-credential-keys\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.992904 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data-custom\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.992938 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw422\" (UniqueName: \"kubernetes.io/projected/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-kube-api-access-hw422\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.992969 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-config-data\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.992995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-internal-tls-certs\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-logs\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993068 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-fernet-keys\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993104 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-scripts\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rfhj\" (UniqueName: \"kubernetes.io/projected/5a6d3039-699c-477b-a835-2b6fa2709dde-kube-api-access-7rfhj\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-combined-ca-bundle\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993187 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:48.993203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-combined-ca-bundle\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.009632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-config-data\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.009881 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-scripts\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.011698 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-internal-tls-certs\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.012041 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-public-tls-certs\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.012418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-credential-keys\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.018703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-combined-ca-bundle\") pod 
\"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.018742 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-55996449db-5bw4p"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.020493 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a6d3039-699c-477b-a835-2b6fa2709dde-fernet-keys\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.020544 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.027088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rfhj\" (UniqueName: \"kubernetes.io/projected/5a6d3039-699c-477b-a835-2b6fa2709dde-kube-api-access-7rfhj\") pod \"keystone-9455b57b9-66mtr\" (UID: \"5a6d3039-699c-477b-a835-2b6fa2709dde\") " pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.049632 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.052792 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55996449db-5bw4p"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.100834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-logs\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 
crc kubenswrapper[4735]: I0317 01:29:49.100972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.101000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-combined-ca-bundle\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.101050 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data-custom\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.101082 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw422\" (UniqueName: \"kubernetes.io/projected/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-kube-api-access-hw422\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.103494 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-logs\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 
01:29:49.121110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data-custom\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.141696 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.142733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-combined-ca-bundle\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.153356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw422\" (UniqueName: \"kubernetes.io/projected/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-kube-api-access-hw422\") pod \"barbican-worker-8d99585df-2dfcf\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.204773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-config-data-custom\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: 
I0317 01:29:49.205917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-combined-ca-bundle\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.205995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xq45\" (UniqueName: \"kubernetes.io/projected/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-kube-api-access-5xq45\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.206019 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-config-data\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.206069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-logs\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.307163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xq45\" (UniqueName: \"kubernetes.io/projected/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-kube-api-access-5xq45\") pod 
\"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.307196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-config-data\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.307227 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-logs\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.307358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-config-data-custom\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.307433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-combined-ca-bundle\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.318435 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-logs\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.331782 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-config-data\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.341345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-config-data-custom\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.367966 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-764bc646f-l9wnd"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.390273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-combined-ca-bundle\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.392935 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.392954 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.392965 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d7b898ff7-5l74x"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.394076 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-764bc646f-l9wnd"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.394096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d7b898ff7-5l74x"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.394109 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5cddfdcf58-2zqd4"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.398202 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.399648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.399883 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cddfdcf58-2zqd4"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.399971 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.401610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xq45\" (UniqueName: \"kubernetes.io/projected/8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f-kube-api-access-5xq45\") pod \"barbican-keystone-listener-55996449db-5bw4p\" (UID: \"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f\") " pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.403252 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.429526 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.492456 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.520840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-config-data-custom\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.520923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6r6\" (UniqueName: \"kubernetes.io/projected/e8bd79d0-d430-4023-8625-41e29f4839a0-kube-api-access-7v6r6\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.520966 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-combined-ca-bundle\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-combined-ca-bundle\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-svc\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc 
kubenswrapper[4735]: I0317 01:29:49.521152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bd79d0-d430-4023-8625-41e29f4839a0-logs\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521179 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-config\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521200 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lp56\" (UniqueName: \"kubernetes.io/projected/860ecebd-d15d-483b-bfbd-8c5c80682d6f-kube-api-access-8lp56\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521235 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-config-data\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521261 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" 
Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txcdv\" (UniqueName: \"kubernetes.io/projected/e4e10703-a66a-4bb3-b523-8d5d6e772c04-kube-api-access-txcdv\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data-custom\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.521365 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e10703-a66a-4bb3-b523-8d5d6e772c04-logs\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.538403 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.586063 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-55996449db-5bw4p" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.593279 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.654727 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.655405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-svc\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.655511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.655724 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bd79d0-d430-4023-8625-41e29f4839a0-logs\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.655820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-config\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.655913 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lp56\" (UniqueName: \"kubernetes.io/projected/860ecebd-d15d-483b-bfbd-8c5c80682d6f-kube-api-access-8lp56\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-config-data\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656109 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txcdv\" (UniqueName: \"kubernetes.io/projected/e4e10703-a66a-4bb3-b523-8d5d6e772c04-kube-api-access-txcdv\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656268 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656374 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data-custom\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656462 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e10703-a66a-4bb3-b523-8d5d6e772c04-logs\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-config-data-custom\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6r6\" (UniqueName: \"kubernetes.io/projected/e8bd79d0-d430-4023-8625-41e29f4839a0-kube-api-access-7v6r6\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.656927 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-combined-ca-bundle\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.657058 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-combined-ca-bundle\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.657151 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-config\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.660182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e10703-a66a-4bb3-b523-8d5d6e772c04-logs\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.670625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.671173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-svc\") pod 
\"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.671648 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.674465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.677591 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data-custom\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.678184 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.678828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-config-data\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " 
pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.680881 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bd79d0-d430-4023-8625-41e29f4839a0-logs\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.686036 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9b8f4bc48-rdtm4"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.687554 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.705618 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.705661 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.705829 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.705883 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.705830 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kfjgp" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.743186 4735 generic.go:334] "Generic (PLEG): container finished" podID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerID="bb2c4098e1e6d7df14476cb04bda10bf346296350b7bbaf6ac2863084548a391" exitCode=0 Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.748120 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" event={"ID":"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a","Type":"ContainerDied","Data":"bb2c4098e1e6d7df14476cb04bda10bf346296350b7bbaf6ac2863084548a391"} Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.749442 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.749458 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.750182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-combined-ca-bundle\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.751626 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-combined-ca-bundle\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.752770 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e10703-a66a-4bb3-b523-8d5d6e772c04-config-data-custom\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.760798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-config-data\") pod 
\"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.760847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-internal-tls-certs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.761035 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-combined-ca-bundle\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.762575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lp56\" (UniqueName: \"kubernetes.io/projected/860ecebd-d15d-483b-bfbd-8c5c80682d6f-kube-api-access-8lp56\") pod \"dnsmasq-dns-7d7b898ff7-5l74x\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.767661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-logs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.767736 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-public-tls-certs\") pod 
\"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.767763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-scripts\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.767784 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twscx\" (UniqueName: \"kubernetes.io/projected/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-kube-api-access-twscx\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.768768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txcdv\" (UniqueName: \"kubernetes.io/projected/e4e10703-a66a-4bb3-b523-8d5d6e772c04-kube-api-access-txcdv\") pod \"barbican-worker-764bc646f-l9wnd\" (UID: \"e4e10703-a66a-4bb3-b523-8d5d6e772c04\") " pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.773295 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9b8f4bc48-rdtm4"] Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.819429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6r6\" (UniqueName: \"kubernetes.io/projected/e8bd79d0-d430-4023-8625-41e29f4839a0-kube-api-access-7v6r6\") pod \"barbican-api-5cddfdcf58-2zqd4\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.820910 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.840736 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-combined-ca-bundle\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874481 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-logs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-public-tls-certs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874577 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-scripts\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874597 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twscx\" (UniqueName: 
\"kubernetes.io/projected/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-kube-api-access-twscx\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-config-data\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.874923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-internal-tls-certs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.875030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-logs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.885044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-scripts\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.885257 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-internal-tls-certs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: 
\"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.885396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-config-data\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.889384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-public-tls-certs\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.890393 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-combined-ca-bundle\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.909563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twscx\" (UniqueName: \"kubernetes.io/projected/d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a-kube-api-access-twscx\") pod \"placement-9b8f4bc48-rdtm4\" (UID: \"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a\") " pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:49 crc kubenswrapper[4735]: I0317 01:29:49.971413 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-764bc646f-l9wnd" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.157710 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.234533 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.234585 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.278205 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.282545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.750317 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.751096 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:50 crc kubenswrapper[4735]: I0317 01:29:50.798386 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Mar 17 01:29:51 crc kubenswrapper[4735]: I0317 01:29:51.756934 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:29:51 crc kubenswrapper[4735]: I0317 01:29:51.756958 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.473675 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-648b74c4fd-9pw4q"] Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.475253 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.482804 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.482870 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.486021 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-648b74c4fd-9pw4q"] Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526095 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-combined-ca-bundle\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-config-data-custom\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-config-data\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526242 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hk986\" (UniqueName: \"kubernetes.io/projected/f33736df-541b-4d36-bde6-08767c233625-kube-api-access-hk986\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33736df-541b-4d36-bde6-08767c233625-logs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526313 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-public-tls-certs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.526330 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-internal-tls-certs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628556 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk986\" (UniqueName: \"kubernetes.io/projected/f33736df-541b-4d36-bde6-08767c233625-kube-api-access-hk986\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628597 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33736df-541b-4d36-bde6-08767c233625-logs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-public-tls-certs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-internal-tls-certs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628723 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-combined-ca-bundle\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-config-data-custom\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.628789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-config-data\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.637087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-config-data\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.638629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33736df-541b-4d36-bde6-08767c233625-logs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.639927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-internal-tls-certs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.640725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-public-tls-certs\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.642561 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-combined-ca-bundle\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.654311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f33736df-541b-4d36-bde6-08767c233625-config-data-custom\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.670361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk986\" (UniqueName: \"kubernetes.io/projected/f33736df-541b-4d36-bde6-08767c233625-kube-api-access-hk986\") pod \"barbican-api-648b74c4fd-9pw4q\" (UID: \"f33736df-541b-4d36-bde6-08767c233625\") " pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.792063 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.792085 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:29:52 crc kubenswrapper[4735]: I0317 01:29:52.795263 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.139028 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.188560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-svc\") pod \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.188600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-config\") pod \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.188741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-swift-storage-0\") pod \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.188788 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-nb\") pod \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.188816 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-sb\") pod \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.188975 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6kgn\" 
(UniqueName: \"kubernetes.io/projected/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-kube-api-access-b6kgn\") pod \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\" (UID: \"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a\") " Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.283690 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-kube-api-access-b6kgn" (OuterVolumeSpecName: "kube-api-access-b6kgn") pod "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" (UID: "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a"). InnerVolumeSpecName "kube-api-access-b6kgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.294342 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6kgn\" (UniqueName: \"kubernetes.io/projected/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-kube-api-access-b6kgn\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.583097 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" (UID: "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.606581 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.697842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" (UID: "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.709155 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.727409 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-config" (OuterVolumeSpecName: "config") pod "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" (UID: "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.728048 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" (UID: "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.793326 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" (UID: "2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.804538 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" event={"ID":"2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a","Type":"ContainerDied","Data":"7b6ba21185d179c465e47e5c3afbd1d5d66af8e199b1b5bdb6a32fdae5fd7189"} Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.804585 4735 scope.go:117] "RemoveContainer" containerID="bb2c4098e1e6d7df14476cb04bda10bf346296350b7bbaf6ac2863084548a391" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.804749 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b8b7566d9-cvgwg" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.810303 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.810326 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.810339 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.921536 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8b7566d9-cvgwg"] Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.929049 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b8b7566d9-cvgwg"] Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.935116 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9455b57b9-66mtr"] Mar 17 01:29:53 crc kubenswrapper[4735]: I0317 01:29:53.945383 4735 scope.go:117] "RemoveContainer" containerID="e815b2dc70a2e5fe384e47f3c1da4cc36f7728f0190229700bf4866bc2a79f3b" Mar 17 01:29:53 crc kubenswrapper[4735]: W0317 01:29:53.970400 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a6d3039_699c_477b_a835_2b6fa2709dde.slice/crio-25a7959c4a1f8025d201a5267270221d0a543f918849b0d59997cdd68dd831f1 WatchSource:0}: Error finding container 25a7959c4a1f8025d201a5267270221d0a543f918849b0d59997cdd68dd831f1: Status 404 returned error can't find the container with id 
25a7959c4a1f8025d201a5267270221d0a543f918849b0d59997cdd68dd831f1 Mar 17 01:29:54 crc kubenswrapper[4735]: E0317 01:29:54.141764 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e759bab_3b6e_4ff2_88f5_0bb8bef00d9a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e759bab_3b6e_4ff2_88f5_0bb8bef00d9a.slice/crio-7b6ba21185d179c465e47e5c3afbd1d5d66af8e199b1b5bdb6a32fdae5fd7189\": RecentStats: unable to find data in memory cache]" Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.291961 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8d99585df-2dfcf"] Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.299110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cddfdcf58-2zqd4"] Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.311585 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9b8f4bc48-rdtm4"] Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.311626 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-764bc646f-l9wnd"] Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.631340 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-648b74c4fd-9pw4q"] Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.658843 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55996449db-5bw4p"] Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.662598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d7b898ff7-5l74x"] Mar 17 01:29:54 crc kubenswrapper[4735]: W0317 01:29:54.686127 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf33736df_541b_4d36_bde6_08767c233625.slice/crio-c973aade30fcb428ccb8b3adad7f5d03321e87e798b853793926bcdfa4cd40bb WatchSource:0}: Error finding container c973aade30fcb428ccb8b3adad7f5d03321e87e798b853793926bcdfa4cd40bb: Status 404 returned error can't find the container with id c973aade30fcb428ccb8b3adad7f5d03321e87e798b853793926bcdfa4cd40bb Mar 17 01:29:54 crc kubenswrapper[4735]: W0317 01:29:54.759059 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd863b4_c61a_4c9b_9a72_f2fe04e3ce3f.slice/crio-7e4250164ba96a668474122870c44b79843a9aed115e62878433c9ae161b3d42 WatchSource:0}: Error finding container 7e4250164ba96a668474122870c44b79843a9aed115e62878433c9ae161b3d42: Status 404 returned error can't find the container with id 7e4250164ba96a668474122870c44b79843a9aed115e62878433c9ae161b3d42 Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.853283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cddfdcf58-2zqd4" event={"ID":"e8bd79d0-d430-4023-8625-41e29f4839a0","Type":"ContainerStarted","Data":"ee9b4dc2d90a8d5b2f8ae25d4d9d79129accc330feb8a94810fc2042d18fde74"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.854841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-97m76" event={"ID":"f4f6de0a-1a49-4470-80f3-5c807d5899a4","Type":"ContainerStarted","Data":"871c342605904236a032b7990bc765f65b061a248331a3b5c1846c04f84be997"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.859921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72dc2896-7cb6-4385-91a9-ae28c7f8907a","Type":"ContainerStarted","Data":"8f5f1b727af3536437684fa1cb307383b03ad29a3b82534376f9026226d02b68"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.861355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-9b8f4bc48-rdtm4" event={"ID":"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a","Type":"ContainerStarted","Data":"e83a4672fd79e2af6bc3e824a905e6d001c230b1f8e27f14377dc241111fbff6"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.872242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" event={"ID":"860ecebd-d15d-483b-bfbd-8c5c80682d6f","Type":"ContainerStarted","Data":"1f15b6059f4d58f462c12646f354432a5b315f5caa8880cd31cb91e83a7b0e28"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.876724 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-97m76" podStartSLOduration=3.881106267 podStartE2EDuration="55.876708143s" podCreationTimestamp="2026-03-17 01:28:59 +0000 UTC" firstStartedPulling="2026-03-17 01:29:01.107061424 +0000 UTC m=+1166.739294402" lastFinishedPulling="2026-03-17 01:29:53.1026633 +0000 UTC m=+1218.734896278" observedRunningTime="2026-03-17 01:29:54.870931911 +0000 UTC m=+1220.503164889" watchObservedRunningTime="2026-03-17 01:29:54.876708143 +0000 UTC m=+1220.508941121" Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.889662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8d99585df-2dfcf" event={"ID":"aedf685c-adde-4ccb-8a55-9c54e59ec1d2","Type":"ContainerStarted","Data":"6f5661642ddb0de2d974842316d1dac4abe348c5cd5d1b2f0873af08e2ac6f98"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.898462 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-764bc646f-l9wnd" event={"ID":"e4e10703-a66a-4bb3-b523-8d5d6e772c04","Type":"ContainerStarted","Data":"8105fb177d1c43efee6a36d4c66eb64a537c6b80bc65fd140aeeb97f1f2adbd7"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.902517 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-648b74c4fd-9pw4q" 
event={"ID":"f33736df-541b-4d36-bde6-08767c233625","Type":"ContainerStarted","Data":"c973aade30fcb428ccb8b3adad7f5d03321e87e798b853793926bcdfa4cd40bb"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.907491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55996449db-5bw4p" event={"ID":"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f","Type":"ContainerStarted","Data":"7e4250164ba96a668474122870c44b79843a9aed115e62878433c9ae161b3d42"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.915036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9455b57b9-66mtr" event={"ID":"5a6d3039-699c-477b-a835-2b6fa2709dde","Type":"ContainerStarted","Data":"b5ae353d11fb459ddd448670098f0f0cfba307d257a4e0a5bb597ed70cafced2"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.915074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9455b57b9-66mtr" event={"ID":"5a6d3039-699c-477b-a835-2b6fa2709dde","Type":"ContainerStarted","Data":"25a7959c4a1f8025d201a5267270221d0a543f918849b0d59997cdd68dd831f1"} Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.916269 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:29:54 crc kubenswrapper[4735]: I0317 01:29:54.943043 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9455b57b9-66mtr" podStartSLOduration=6.943017767 podStartE2EDuration="6.943017767s" podCreationTimestamp="2026-03-17 01:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:54.940523795 +0000 UTC m=+1220.572756763" watchObservedRunningTime="2026-03-17 01:29:54.943017767 +0000 UTC m=+1220.575250745" Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.119127 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" path="/var/lib/kubelet/pods/2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a/volumes" Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.937211 4735 generic.go:334] "Generic (PLEG): container finished" podID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerID="1d80434aca65f06146d11ceea9a66dd01ec651bc1cc2840555b89f682b28bff7" exitCode=0 Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.937452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" event={"ID":"860ecebd-d15d-483b-bfbd-8c5c80682d6f","Type":"ContainerDied","Data":"1d80434aca65f06146d11ceea9a66dd01ec651bc1cc2840555b89f682b28bff7"} Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.943667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cddfdcf58-2zqd4" event={"ID":"e8bd79d0-d430-4023-8625-41e29f4839a0","Type":"ContainerStarted","Data":"4721eb628ca2c6f264ae7895f5071a05345b1f9fd8758b121631c952be5dcd23"} Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.943700 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cddfdcf58-2zqd4" event={"ID":"e8bd79d0-d430-4023-8625-41e29f4839a0","Type":"ContainerStarted","Data":"fcda2be56e91b43d4bcda1751a6fdd560fc5f32ad0cc8f5dfa28f7bc141fbd13"} Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.944355 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.944383 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.980679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-648b74c4fd-9pw4q" event={"ID":"f33736df-541b-4d36-bde6-08767c233625","Type":"ContainerStarted","Data":"8d55f1bd91f8130a564c68472acccb827656e28460c2993271f7335ff8e84d3d"} Mar 17 
01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.980732 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-648b74c4fd-9pw4q" event={"ID":"f33736df-541b-4d36-bde6-08767c233625","Type":"ContainerStarted","Data":"922c82c1801076ab104d698619622e20ed5e599fc4180821fb0a766e3f3cfd67"} Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.980988 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.981257 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:29:55 crc kubenswrapper[4735]: I0317 01:29:55.991066 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podStartSLOduration=6.9910491310000005 podStartE2EDuration="6.991049131s" podCreationTimestamp="2026-03-17 01:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:55.984356986 +0000 UTC m=+1221.616589964" watchObservedRunningTime="2026-03-17 01:29:55.991049131 +0000 UTC m=+1221.623282109" Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.001647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9b8f4bc48-rdtm4" event={"ID":"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a","Type":"ContainerStarted","Data":"f0ec02d81f5c168bc00c8c8da87411efc073f90b1d6e62a1aec0639499940200"} Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.001687 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9b8f4bc48-rdtm4" event={"ID":"d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a","Type":"ContainerStarted","Data":"a3c60073ad78e5a445e4fd0f27c1144896c71769333cac78de373166fbd2af12"} Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.002788 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.002816 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.012056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bl4z7" event={"ID":"bcfd83df-2abe-43e9-ab0b-88ec269eb204","Type":"ContainerStarted","Data":"22790db9bae7a752fa7d704f810b22121b9b82d55b6ac4f75bff5e5f2172eeb3"} Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.018622 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-648b74c4fd-9pw4q" podStartSLOduration=4.01860416 podStartE2EDuration="4.01860416s" podCreationTimestamp="2026-03-17 01:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:56.009905526 +0000 UTC m=+1221.642138504" watchObservedRunningTime="2026-03-17 01:29:56.01860416 +0000 UTC m=+1221.650837138" Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.086334 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9b8f4bc48-rdtm4" podStartSLOduration=7.086315149 podStartE2EDuration="7.086315149s" podCreationTimestamp="2026-03-17 01:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:56.062302167 +0000 UTC m=+1221.694535145" watchObservedRunningTime="2026-03-17 01:29:56.086315149 +0000 UTC m=+1221.718548127" Mar 17 01:29:56 crc kubenswrapper[4735]: I0317 01:29:56.114445 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bl4z7" podStartSLOduration=5.941297081 podStartE2EDuration="57.114427651s" podCreationTimestamp="2026-03-17 01:28:59 +0000 UTC" firstStartedPulling="2026-03-17 
01:29:01.919926963 +0000 UTC m=+1167.552159941" lastFinishedPulling="2026-03-17 01:29:53.093057533 +0000 UTC m=+1218.725290511" observedRunningTime="2026-03-17 01:29:56.100918678 +0000 UTC m=+1221.733151656" watchObservedRunningTime="2026-03-17 01:29:56.114427651 +0000 UTC m=+1221.746660619" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.004790 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.005159 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.060204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" event={"ID":"860ecebd-d15d-483b-bfbd-8c5c80682d6f","Type":"ContainerStarted","Data":"0febb4cda137a9f64042f0ee793b17229c3087501a644197f2c50195f4802dc1"} Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.061014 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.083093 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" podStartSLOduration=8.08308041 podStartE2EDuration="8.08308041s" podCreationTimestamp="2026-03-17 01:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:29:57.079265256 +0000 UTC m=+1222.711498234" watchObservedRunningTime="2026-03-17 01:29:57.08308041 +0000 UTC m=+1222.715313388" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.220533 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.220641 4735 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.227326 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 01:29:57 crc kubenswrapper[4735]: I0317 01:29:57.674461 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 01:29:58 crc kubenswrapper[4735]: I0317 01:29:58.659462 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 01:29:58 crc kubenswrapper[4735]: I0317 01:29:58.817817 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76cdf95cd8-vx5pd" podUID="fc3f6d90-40e7-4962-b788-1e9924edb48f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.102632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55996449db-5bw4p" event={"ID":"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f","Type":"ContainerStarted","Data":"eb47e29142df129b0eee304d827cc20aa13438cff350b77b0d0bb2781eb21672"} Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.109149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8d99585df-2dfcf" event={"ID":"aedf685c-adde-4ccb-8a55-9c54e59ec1d2","Type":"ContainerStarted","Data":"7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9"} Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.167717 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf"] Mar 17 01:30:00 crc kubenswrapper[4735]: E0317 01:30:00.183295 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="init" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.183318 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="init" Mar 17 01:30:00 crc kubenswrapper[4735]: E0317 01:30:00.183358 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="dnsmasq-dns" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.183365 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="dnsmasq-dns" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.183532 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e759bab-3b6e-4ff2-88f5-0bb8bef00d9a" containerName="dnsmasq-dns" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.184142 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.187109 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.187307 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.213034 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf"] Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.260086 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561850-jpv8n"] Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.261203 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.268585 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.268782 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.268937 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.294155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-jpv8n"] Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.326622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvghw\" (UniqueName: 
\"kubernetes.io/projected/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111-kube-api-access-rvghw\") pod \"auto-csr-approver-29561850-jpv8n\" (UID: \"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111\") " pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.326741 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-config-volume\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.326833 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhcr\" (UniqueName: \"kubernetes.io/projected/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-kube-api-access-nfhcr\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.326871 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-secret-volume\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.429326 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-config-volume\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 
crc kubenswrapper[4735]: I0317 01:30:00.429408 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfhcr\" (UniqueName: \"kubernetes.io/projected/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-kube-api-access-nfhcr\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.429455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-secret-volume\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.429501 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvghw\" (UniqueName: \"kubernetes.io/projected/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111-kube-api-access-rvghw\") pod \"auto-csr-approver-29561850-jpv8n\" (UID: \"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111\") " pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.430736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-config-volume\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.443568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-secret-volume\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.456662 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvghw\" (UniqueName: \"kubernetes.io/projected/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111-kube-api-access-rvghw\") pod \"auto-csr-approver-29561850-jpv8n\" (UID: \"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111\") " pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.506640 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfhcr\" (UniqueName: \"kubernetes.io/projected/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-kube-api-access-nfhcr\") pod \"collect-profiles-29561850-vlxtf\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.521485 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:00 crc kubenswrapper[4735]: I0317 01:30:00.637421 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.150351 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8d99585df-2dfcf" event={"ID":"aedf685c-adde-4ccb-8a55-9c54e59ec1d2","Type":"ContainerStarted","Data":"5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5"} Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.172976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-764bc646f-l9wnd" event={"ID":"e4e10703-a66a-4bb3-b523-8d5d6e772c04","Type":"ContainerStarted","Data":"acca0606bcbed413795fba1d5cd6304c1c8c2c07126f039ccfdb1e78b446c3c4"} Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.173022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-764bc646f-l9wnd" event={"ID":"e4e10703-a66a-4bb3-b523-8d5d6e772c04","Type":"ContainerStarted","Data":"64f88ac2bf7bad1173b82974bea4155759a6a23e30ab0ecf26ecbaf899c0588d"} Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.180479 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8d99585df-2dfcf" podStartSLOduration=7.941581092 podStartE2EDuration="13.180461602s" podCreationTimestamp="2026-03-17 01:29:48 +0000 UTC" firstStartedPulling="2026-03-17 01:29:54.262652572 +0000 UTC m=+1219.894885540" lastFinishedPulling="2026-03-17 01:29:59.501533072 +0000 UTC m=+1225.133766050" observedRunningTime="2026-03-17 01:30:01.171199423 +0000 UTC m=+1226.803432401" watchObservedRunningTime="2026-03-17 01:30:01.180461602 +0000 UTC m=+1226.812694580" Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.186447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55996449db-5bw4p" event={"ID":"8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f","Type":"ContainerStarted","Data":"08f94107a15ee2d80a47022b811603e7ad315efe8c0de487ca23135cbc327a01"} Mar 
17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.192648 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf"] Mar 17 01:30:01 crc kubenswrapper[4735]: W0317 01:30:01.201895 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f202568_8dcd_4ba1_8b79_a80560cfcd1a.slice/crio-3f37bdb2147e5b724aab50fa64c141d136a88e7e978e5daa60223e081d00d7f7 WatchSource:0}: Error finding container 3f37bdb2147e5b724aab50fa64c141d136a88e7e978e5daa60223e081d00d7f7: Status 404 returned error can't find the container with id 3f37bdb2147e5b724aab50fa64c141d136a88e7e978e5daa60223e081d00d7f7 Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.223978 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-764bc646f-l9wnd" podStartSLOduration=7.099049153 podStartE2EDuration="12.223961344s" podCreationTimestamp="2026-03-17 01:29:49 +0000 UTC" firstStartedPulling="2026-03-17 01:29:54.382331582 +0000 UTC m=+1220.014564560" lastFinishedPulling="2026-03-17 01:29:59.507243773 +0000 UTC m=+1225.139476751" observedRunningTime="2026-03-17 01:30:01.220265762 +0000 UTC m=+1226.852498740" watchObservedRunningTime="2026-03-17 01:30:01.223961344 +0000 UTC m=+1226.856194322" Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.244698 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-55996449db-5bw4p" podStartSLOduration=8.533806345 podStartE2EDuration="13.244681464s" podCreationTimestamp="2026-03-17 01:29:48 +0000 UTC" firstStartedPulling="2026-03-17 01:29:54.791805371 +0000 UTC m=+1220.424038349" lastFinishedPulling="2026-03-17 01:29:59.50268049 +0000 UTC m=+1225.134913468" observedRunningTime="2026-03-17 01:30:01.238620285 +0000 UTC m=+1226.870853253" watchObservedRunningTime="2026-03-17 01:30:01.244681464 +0000 UTC m=+1226.876914432" 
Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.246416 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-8d99585df-2dfcf"] Mar 17 01:30:01 crc kubenswrapper[4735]: I0317 01:30:01.311032 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-jpv8n"] Mar 17 01:30:02 crc kubenswrapper[4735]: I0317 01:30:02.208457 4735 generic.go:334] "Generic (PLEG): container finished" podID="6f202568-8dcd-4ba1-8b79-a80560cfcd1a" containerID="df368615a720d15b95af785eff55f70fe59fc4a47f7f825aefce5b12f370944b" exitCode=0 Mar 17 01:30:02 crc kubenswrapper[4735]: I0317 01:30:02.208722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" event={"ID":"6f202568-8dcd-4ba1-8b79-a80560cfcd1a","Type":"ContainerDied","Data":"df368615a720d15b95af785eff55f70fe59fc4a47f7f825aefce5b12f370944b"} Mar 17 01:30:02 crc kubenswrapper[4735]: I0317 01:30:02.208745 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" event={"ID":"6f202568-8dcd-4ba1-8b79-a80560cfcd1a","Type":"ContainerStarted","Data":"3f37bdb2147e5b724aab50fa64c141d136a88e7e978e5daa60223e081d00d7f7"} Mar 17 01:30:02 crc kubenswrapper[4735]: I0317 01:30:02.214578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" event={"ID":"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111","Type":"ContainerStarted","Data":"4d0001a3753137dbace70e98be9f4ca970e81fdc6b781dd624f98afdf9e6da39"} Mar 17 01:30:03 crc kubenswrapper[4735]: I0317 01:30:03.225040 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" containerID="871c342605904236a032b7990bc765f65b061a248331a3b5c1846c04f84be997" exitCode=0 Mar 17 01:30:03 crc kubenswrapper[4735]: I0317 01:30:03.225125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-db-sync-97m76" event={"ID":"f4f6de0a-1a49-4470-80f3-5c807d5899a4","Type":"ContainerDied","Data":"871c342605904236a032b7990bc765f65b061a248331a3b5c1846c04f84be997"} Mar 17 01:30:03 crc kubenswrapper[4735]: I0317 01:30:03.225922 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-8d99585df-2dfcf" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker" containerID="cri-o://5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5" gracePeriod=30 Mar 17 01:30:03 crc kubenswrapper[4735]: I0317 01:30:03.225838 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-8d99585df-2dfcf" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker-log" containerID="cri-o://7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9" gracePeriod=30 Mar 17 01:30:03 crc kubenswrapper[4735]: I0317 01:30:03.929464 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:30:04 crc kubenswrapper[4735]: I0317 01:30:04.236655 4735 generic.go:334] "Generic (PLEG): container finished" podID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerID="7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9" exitCode=143 Mar 17 01:30:04 crc kubenswrapper[4735]: I0317 01:30:04.236840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8d99585df-2dfcf" event={"ID":"aedf685c-adde-4ccb-8a55-9c54e59ec1d2","Type":"ContainerDied","Data":"7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9"} Mar 17 01:30:04 crc kubenswrapper[4735]: I0317 01:30:04.823005 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:30:04 crc kubenswrapper[4735]: I0317 01:30:04.870164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:30:04 crc kubenswrapper[4735]: I0317 01:30:04.884634 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c46b7ff7-gkr8h"] Mar 17 01:30:04 crc kubenswrapper[4735]: I0317 01:30:04.884840 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="dnsmasq-dns" containerID="cri-o://a680d6fbf902d3c23d3b87a260006e1e97ade12bbe5307e7ae17b7161fad046a" gracePeriod=10 Mar 17 01:30:05 crc kubenswrapper[4735]: I0317 01:30:05.261221 4735 generic.go:334] "Generic (PLEG): container finished" podID="9753d2c8-2d80-4019-965a-af117590e79f" containerID="a680d6fbf902d3c23d3b87a260006e1e97ade12bbe5307e7ae17b7161fad046a" exitCode=0 Mar 17 01:30:05 crc kubenswrapper[4735]: I0317 01:30:05.261263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" event={"ID":"9753d2c8-2d80-4019-965a-af117590e79f","Type":"ContainerDied","Data":"a680d6fbf902d3c23d3b87a260006e1e97ade12bbe5307e7ae17b7161fad046a"} Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.269759 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-648b74c4fd-9pw4q" Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.342748 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5cddfdcf58-2zqd4"] Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.348095 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" containerID="cri-o://fcda2be56e91b43d4bcda1751a6fdd560fc5f32ad0cc8f5dfa28f7bc141fbd13" gracePeriod=30 Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.348449 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" containerID="cri-o://4721eb628ca2c6f264ae7895f5071a05345b1f9fd8758b121631c952be5dcd23" gracePeriod=30 Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.361316 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": EOF" Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.361327 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": EOF" Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.362728 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": EOF" Mar 17 01:30:06 crc kubenswrapper[4735]: I0317 01:30:06.364977 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": EOF" Mar 17 01:30:07 crc kubenswrapper[4735]: I0317 01:30:07.280631 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerID="fcda2be56e91b43d4bcda1751a6fdd560fc5f32ad0cc8f5dfa28f7bc141fbd13" exitCode=143 Mar 17 01:30:07 crc kubenswrapper[4735]: I0317 01:30:07.280692 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cddfdcf58-2zqd4" 
event={"ID":"e8bd79d0-d430-4023-8625-41e29f4839a0","Type":"ContainerDied","Data":"fcda2be56e91b43d4bcda1751a6fdd560fc5f32ad0cc8f5dfa28f7bc141fbd13"} Mar 17 01:30:07 crc kubenswrapper[4735]: I0317 01:30:07.292981 4735 generic.go:334] "Generic (PLEG): container finished" podID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" containerID="22790db9bae7a752fa7d704f810b22121b9b82d55b6ac4f75bff5e5f2172eeb3" exitCode=0 Mar 17 01:30:07 crc kubenswrapper[4735]: I0317 01:30:07.293018 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bl4z7" event={"ID":"bcfd83df-2abe-43e9-ab0b-88ec269eb204","Type":"ContainerDied","Data":"22790db9bae7a752fa7d704f810b22121b9b82d55b6ac4f75bff5e5f2172eeb3"} Mar 17 01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.499057 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: connect: connection refused" Mar 17 01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.658414 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.716429 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.830254 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76cdf95cd8-vx5pd" podUID="fc3f6d90-40e7-4962-b788-1e9924edb48f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Mar 17 
01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.996877 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77b7845dbf-mbcjv"] Mar 17 01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.997147 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77b7845dbf-mbcjv" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-api" containerID="cri-o://7fc5010997d8e172531eb08ef5e624286b7138788890be11979a5133c277fd45" gracePeriod=30 Mar 17 01:30:08 crc kubenswrapper[4735]: I0317 01:30:08.997498 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77b7845dbf-mbcjv" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-httpd" containerID="cri-o://c6f64c6db3dd3e5c32fbbded6c1c35b16caf87a21b0878cd42eb55043dafc2f8" gracePeriod=30 Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.036681 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bc47cb9c7-j5cjp"] Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.038815 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.047069 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.050076 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bc47cb9c7-j5cjp"] Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127621 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-combined-ca-bundle\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-config\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-public-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-ovndb-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 
01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127796 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-internal-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127845 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnrpw\" (UniqueName: \"kubernetes.io/projected/03b7ea53-cedc-42d2-a218-183b2be7646c-kube-api-access-qnrpw\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.127879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-httpd-config\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.229454 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-config\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.229544 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-public-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 
01:30:09.229563 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-ovndb-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.229582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-internal-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.229637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrpw\" (UniqueName: \"kubernetes.io/projected/03b7ea53-cedc-42d2-a218-183b2be7646c-kube-api-access-qnrpw\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.229660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-httpd-config\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.229683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-combined-ca-bundle\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.236852 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-public-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.240388 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-config\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.250102 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-combined-ca-bundle\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.251226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-httpd-config\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.251253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-ovndb-tls-certs\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.258297 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-internal-tls-certs\") pod 
\"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.263294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrpw\" (UniqueName: \"kubernetes.io/projected/03b7ea53-cedc-42d2-a218-183b2be7646c-kube-api-access-qnrpw\") pod \"neutron-7bc47cb9c7-j5cjp\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.328449 4735 generic.go:334] "Generic (PLEG): container finished" podID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerID="c6f64c6db3dd3e5c32fbbded6c1c35b16caf87a21b0878cd42eb55043dafc2f8" exitCode=0 Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.328503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77b7845dbf-mbcjv" event={"ID":"7baa4934-72f6-4235-9168-f7dbc175b63d","Type":"ContainerDied","Data":"c6f64c6db3dd3e5c32fbbded6c1c35b16caf87a21b0878cd42eb55043dafc2f8"} Mar 17 01:30:09 crc kubenswrapper[4735]: I0317 01:30:09.377350 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:11 crc kubenswrapper[4735]: I0317 01:30:11.133836 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77b7845dbf-mbcjv" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9696/\": dial tcp 10.217.0.164:9696: connect: connection refused" Mar 17 01:30:11 crc kubenswrapper[4735]: I0317 01:30:11.407990 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.189731 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.196290 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.201136 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-97m76" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318542 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-db-sync-config-data\") pod \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318650 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-config-volume\") pod \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318702 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-secret-volume\") pod \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318727 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tfxm\" (UniqueName: \"kubernetes.io/projected/bcfd83df-2abe-43e9-ab0b-88ec269eb204-kube-api-access-9tfxm\") pod \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318748 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfhcr\" (UniqueName: \"kubernetes.io/projected/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-kube-api-access-nfhcr\") pod \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\" (UID: \"6f202568-8dcd-4ba1-8b79-a80560cfcd1a\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318777 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-config-data\") pod \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318796 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-combined-ca-bundle\") pod \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcfd83df-2abe-43e9-ab0b-88ec269eb204-etc-machine-id\") pod \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-combined-ca-bundle\") pod \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318931 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96m7q\" (UniqueName: \"kubernetes.io/projected/f4f6de0a-1a49-4470-80f3-5c807d5899a4-kube-api-access-96m7q\") pod \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.318944 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-scripts\") pod \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\" (UID: \"bcfd83df-2abe-43e9-ab0b-88ec269eb204\") " Mar 17 01:30:12 
crc kubenswrapper[4735]: I0317 01:30:12.318972 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-config-data\") pod \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\" (UID: \"f4f6de0a-1a49-4470-80f3-5c807d5899a4\") " Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.326367 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f202568-8dcd-4ba1-8b79-a80560cfcd1a" (UID: "6f202568-8dcd-4ba1-8b79-a80560cfcd1a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.331868 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f202568-8dcd-4ba1-8b79-a80560cfcd1a" (UID: "6f202568-8dcd-4ba1-8b79-a80560cfcd1a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.332426 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-scripts" (OuterVolumeSpecName: "scripts") pod "bcfd83df-2abe-43e9-ab0b-88ec269eb204" (UID: "bcfd83df-2abe-43e9-ab0b-88ec269eb204"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.344950 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcfd83df-2abe-43e9-ab0b-88ec269eb204-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bcfd83df-2abe-43e9-ab0b-88ec269eb204" (UID: "bcfd83df-2abe-43e9-ab0b-88ec269eb204"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.358575 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bcfd83df-2abe-43e9-ab0b-88ec269eb204" (UID: "bcfd83df-2abe-43e9-ab0b-88ec269eb204"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.421091 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-kube-api-access-nfhcr" (OuterVolumeSpecName: "kube-api-access-nfhcr") pod "6f202568-8dcd-4ba1-8b79-a80560cfcd1a" (UID: "6f202568-8dcd-4ba1-8b79-a80560cfcd1a"). InnerVolumeSpecName "kube-api-access-nfhcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.422629 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f6de0a-1a49-4470-80f3-5c807d5899a4-kube-api-access-96m7q" (OuterVolumeSpecName: "kube-api-access-96m7q") pod "f4f6de0a-1a49-4470-80f3-5c807d5899a4" (UID: "f4f6de0a-1a49-4470-80f3-5c807d5899a4"). InnerVolumeSpecName "kube-api-access-96m7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425931 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcfd83df-2abe-43e9-ab0b-88ec269eb204-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425946 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96m7q\" (UniqueName: \"kubernetes.io/projected/f4f6de0a-1a49-4470-80f3-5c807d5899a4-kube-api-access-96m7q\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425957 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425965 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425975 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425983 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.425992 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfhcr\" (UniqueName: \"kubernetes.io/projected/6f202568-8dcd-4ba1-8b79-a80560cfcd1a-kube-api-access-nfhcr\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 
01:30:12.428913 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfd83df-2abe-43e9-ab0b-88ec269eb204-kube-api-access-9tfxm" (OuterVolumeSpecName: "kube-api-access-9tfxm") pod "bcfd83df-2abe-43e9-ab0b-88ec269eb204" (UID: "bcfd83df-2abe-43e9-ab0b-88ec269eb204"). InnerVolumeSpecName "kube-api-access-9tfxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.449047 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-97m76" event={"ID":"f4f6de0a-1a49-4470-80f3-5c807d5899a4","Type":"ContainerDied","Data":"6c59e39f44d59fef9cbfc73d5b4e591e433513362c5f9d2283063b5ee960942f"} Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.449083 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c59e39f44d59fef9cbfc73d5b4e591e433513362c5f9d2283063b5ee960942f" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.449099 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4f6de0a-1a49-4470-80f3-5c807d5899a4" (UID: "f4f6de0a-1a49-4470-80f3-5c807d5899a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.449138 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-97m76" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.453575 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcfd83df-2abe-43e9-ab0b-88ec269eb204" (UID: "bcfd83df-2abe-43e9-ab0b-88ec269eb204"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.463155 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bl4z7" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.463355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bl4z7" event={"ID":"bcfd83df-2abe-43e9-ab0b-88ec269eb204","Type":"ContainerDied","Data":"2cb12bc5e50c432b00eba6b28ce603c023f7104407adc45f792ab5847e2f38b5"} Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.463893 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cb12bc5e50c432b00eba6b28ce603c023f7104407adc45f792ab5847e2f38b5" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.479091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" event={"ID":"6f202568-8dcd-4ba1-8b79-a80560cfcd1a","Type":"ContainerDied","Data":"3f37bdb2147e5b724aab50fa64c141d136a88e7e978e5daa60223e081d00d7f7"} Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.479129 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f37bdb2147e5b724aab50fa64c141d136a88e7e978e5daa60223e081d00d7f7" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.479182 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.492313 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-config-data" (OuterVolumeSpecName: "config-data") pod "bcfd83df-2abe-43e9-ab0b-88ec269eb204" (UID: "bcfd83df-2abe-43e9-ab0b-88ec269eb204"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.527732 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tfxm\" (UniqueName: \"kubernetes.io/projected/bcfd83df-2abe-43e9-ab0b-88ec269eb204-kube-api-access-9tfxm\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.528187 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.528222 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfd83df-2abe-43e9-ab0b-88ec269eb204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.528231 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.536846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-config-data" (OuterVolumeSpecName: "config-data") pod "f4f6de0a-1a49-4470-80f3-5c807d5899a4" (UID: "f4f6de0a-1a49-4470-80f3-5c807d5899a4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:12 crc kubenswrapper[4735]: I0317 01:30:12.633278 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f6de0a-1a49-4470-80f3-5c807d5899a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.430892 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:13 crc kubenswrapper[4735]: E0317 01:30:13.431235 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" containerName="cinder-db-sync" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.431247 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" containerName="cinder-db-sync" Mar 17 01:30:13 crc kubenswrapper[4735]: E0317 01:30:13.431258 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" containerName="heat-db-sync" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.431266 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" containerName="heat-db-sync" Mar 17 01:30:13 crc kubenswrapper[4735]: E0317 01:30:13.431290 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f202568-8dcd-4ba1-8b79-a80560cfcd1a" containerName="collect-profiles" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.431297 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f202568-8dcd-4ba1-8b79-a80560cfcd1a" containerName="collect-profiles" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.431448 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" containerName="heat-db-sync" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.431472 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" containerName="cinder-db-sync" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.431484 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f202568-8dcd-4ba1-8b79-a80560cfcd1a" containerName="collect-profiles" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.433800 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.439605 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.439791 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x68h4" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.442913 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.443314 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.462245 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.553878 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.554010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f6b305-68ba-47de-8770-6492fe920647-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") 
" pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.554082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lk5q\" (UniqueName: \"kubernetes.io/projected/a9f6b305-68ba-47de-8770-6492fe920647-kube-api-access-7lk5q\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.554161 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.554232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.554424 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.597955 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bd446449-s7v5p"] Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.614698 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.631466 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bd446449-s7v5p"] Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.688468 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lk5q\" (UniqueName: \"kubernetes.io/projected/a9f6b305-68ba-47de-8770-6492fe920647-kube-api-access-7lk5q\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.688815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.689022 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.689329 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.699156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.703998 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f6b305-68ba-47de-8770-6492fe920647-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.704295 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f6b305-68ba-47de-8770-6492fe920647-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.725579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.726592 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.730250 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.734220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7lk5q\" (UniqueName: \"kubernetes.io/projected/a9f6b305-68ba-47de-8770-6492fe920647-kube-api-access-7lk5q\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.747243 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.758277 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.818844 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.818921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.818947 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-svc\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " 
pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.818970 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.819021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfm7x\" (UniqueName: \"kubernetes.io/projected/dee3a5db-edbc-4df1-8087-2c6630561bb2-kube-api-access-bfm7x\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.819040 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-config\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.925873 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfm7x\" (UniqueName: \"kubernetes.io/projected/dee3a5db-edbc-4df1-8087-2c6630561bb2-kube-api-access-bfm7x\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.925924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-config\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " 
pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.926046 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.926096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.926130 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-svc\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.926155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.927064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc 
kubenswrapper[4735]: I0317 01:30:13.927505 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.927667 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.928294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-svc\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.928617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-config\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:13 crc kubenswrapper[4735]: I0317 01:30:13.972581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfm7x\" (UniqueName: \"kubernetes.io/projected/dee3a5db-edbc-4df1-8087-2c6630561bb2-kube-api-access-bfm7x\") pod \"dnsmasq-dns-84bd446449-s7v5p\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.008598 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.010057 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.013818 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.026365 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.129537 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-scripts\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.129820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-logs\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.130593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ng6v\" (UniqueName: \"kubernetes.io/projected/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-kube-api-access-6ng6v\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.130682 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 
01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.130779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.130912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.131026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233246 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-scripts\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233320 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-logs\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ng6v\" (UniqueName: 
\"kubernetes.io/projected/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-kube-api-access-6ng6v\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233420 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233499 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.233740 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc 
kubenswrapper[4735]: I0317 01:30:14.233922 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-logs\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.236722 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-scripts\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.241368 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.247647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.252670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data\") pod \"cinder-api-0\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.257145 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ng6v\" (UniqueName: \"kubernetes.io/projected/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-kube-api-access-6ng6v\") pod \"cinder-api-0\" (UID: 
\"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.267469 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.269641 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": read tcp 10.217.0.2:33826->10.217.0.170:9311: read: connection reset by peer" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.269681 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": read tcp 10.217.0.2:33816->10.217.0.170:9311: read: connection reset by peer" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.270432 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": dial tcp 10.217.0.170:9311: connect: connection refused" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.353484 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.502716 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerID="4721eb628ca2c6f264ae7895f5071a05345b1f9fd8758b121631c952be5dcd23" exitCode=0 Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.503621 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cddfdcf58-2zqd4" event={"ID":"e8bd79d0-d430-4023-8625-41e29f4839a0","Type":"ContainerDied","Data":"4721eb628ca2c6f264ae7895f5071a05345b1f9fd8758b121631c952be5dcd23"} Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.506471 4735 generic.go:334] "Generic (PLEG): container finished" podID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerID="7fc5010997d8e172531eb08ef5e624286b7138788890be11979a5133c277fd45" exitCode=0 Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.506511 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77b7845dbf-mbcjv" event={"ID":"7baa4934-72f6-4235-9168-f7dbc175b63d","Type":"ContainerDied","Data":"7fc5010997d8e172531eb08ef5e624286b7138788890be11979a5133c277fd45"} Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.845301 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": dial tcp 10.217.0.170:9311: connect: connection refused" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.845398 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:30:14 crc kubenswrapper[4735]: I0317 01:30:14.845382 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cddfdcf58-2zqd4" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" 
probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": dial tcp 10.217.0.170:9311: connect: connection refused" Mar 17 01:30:15 crc kubenswrapper[4735]: E0317 01:30:15.304554 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Mar 17 01:30:15 crc kubenswrapper[4735]: E0317 01:30:15.304709 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8slq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(72dc2896-7cb6-4385-91a9-ae28c7f8907a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 01:30:15 crc kubenswrapper[4735]: E0317 01:30:15.306044 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.556123 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="ceilometer-notification-agent" containerID="cri-o://42e38c0b456e388752e43b9caa8cfb5117ac218cad2a6a4ed2ea4e36c016b6fa" gracePeriod=30 Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.556667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" event={"ID":"9753d2c8-2d80-4019-965a-af117590e79f","Type":"ContainerDied","Data":"71b42a3379d6afdf6d6df2809665bf8e98cae69e39d4933facf703b0cc6306f9"} Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.556688 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b42a3379d6afdf6d6df2809665bf8e98cae69e39d4933facf703b0cc6306f9" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.557443 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="sg-core" containerID="cri-o://8f5f1b727af3536437684fa1cb307383b03ad29a3b82534376f9026226d02b68" gracePeriod=30 Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.583578 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.766467 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-nb\") pod \"9753d2c8-2d80-4019-965a-af117590e79f\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.766523 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-config\") pod \"9753d2c8-2d80-4019-965a-af117590e79f\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.766554 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-sb\") pod \"9753d2c8-2d80-4019-965a-af117590e79f\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.766605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72f4\" (UniqueName: \"kubernetes.io/projected/9753d2c8-2d80-4019-965a-af117590e79f-kube-api-access-w72f4\") pod \"9753d2c8-2d80-4019-965a-af117590e79f\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.766632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-svc\") pod \"9753d2c8-2d80-4019-965a-af117590e79f\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.766810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-swift-storage-0\") pod \"9753d2c8-2d80-4019-965a-af117590e79f\" (UID: \"9753d2c8-2d80-4019-965a-af117590e79f\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.776324 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9753d2c8-2d80-4019-965a-af117590e79f-kube-api-access-w72f4" (OuterVolumeSpecName: "kube-api-access-w72f4") pod "9753d2c8-2d80-4019-965a-af117590e79f" (UID: "9753d2c8-2d80-4019-965a-af117590e79f"). InnerVolumeSpecName "kube-api-access-w72f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.779977 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72f4\" (UniqueName: \"kubernetes.io/projected/9753d2c8-2d80-4019-965a-af117590e79f-kube-api-access-w72f4\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.858666 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9753d2c8-2d80-4019-965a-af117590e79f" (UID: "9753d2c8-2d80-4019-965a-af117590e79f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.865540 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9753d2c8-2d80-4019-965a-af117590e79f" (UID: "9753d2c8-2d80-4019-965a-af117590e79f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.882773 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9753d2c8-2d80-4019-965a-af117590e79f" (UID: "9753d2c8-2d80-4019-965a-af117590e79f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.891888 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.891935 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.891948 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.892156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-config" (OuterVolumeSpecName: "config") pod "9753d2c8-2d80-4019-965a-af117590e79f" (UID: "9753d2c8-2d80-4019-965a-af117590e79f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.905691 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9753d2c8-2d80-4019-965a-af117590e79f" (UID: "9753d2c8-2d80-4019-965a-af117590e79f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.908090 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.929928 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.993199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-combined-ca-bundle\") pod \"e8bd79d0-d430-4023-8625-41e29f4839a0\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.993267 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data\") pod \"e8bd79d0-d430-4023-8625-41e29f4839a0\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.993418 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data-custom\") pod \"e8bd79d0-d430-4023-8625-41e29f4839a0\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.993534 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bd79d0-d430-4023-8625-41e29f4839a0-logs\") pod \"e8bd79d0-d430-4023-8625-41e29f4839a0\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.993552 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v6r6\" (UniqueName: \"kubernetes.io/projected/e8bd79d0-d430-4023-8625-41e29f4839a0-kube-api-access-7v6r6\") pod \"e8bd79d0-d430-4023-8625-41e29f4839a0\" (UID: \"e8bd79d0-d430-4023-8625-41e29f4839a0\") " Mar 17 01:30:15 crc kubenswrapper[4735]: I0317 01:30:15.996024 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bd79d0-d430-4023-8625-41e29f4839a0-logs" (OuterVolumeSpecName: "logs") pod "e8bd79d0-d430-4023-8625-41e29f4839a0" (UID: "e8bd79d0-d430-4023-8625-41e29f4839a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.000993 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.001022 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9753d2c8-2d80-4019-965a-af117590e79f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.001033 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bd79d0-d430-4023-8625-41e29f4839a0-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.001995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8bd79d0-d430-4023-8625-41e29f4839a0" (UID: "e8bd79d0-d430-4023-8625-41e29f4839a0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.011485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bd79d0-d430-4023-8625-41e29f4839a0-kube-api-access-7v6r6" (OuterVolumeSpecName: "kube-api-access-7v6r6") pod "e8bd79d0-d430-4023-8625-41e29f4839a0" (UID: "e8bd79d0-d430-4023-8625-41e29f4839a0"). InnerVolumeSpecName "kube-api-access-7v6r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.115203 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v6r6\" (UniqueName: \"kubernetes.io/projected/e8bd79d0-d430-4023-8625-41e29f4839a0-kube-api-access-7v6r6\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.115232 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.127607 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8bd79d0-d430-4023-8625-41e29f4839a0" (UID: "e8bd79d0-d430-4023-8625-41e29f4839a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.129622 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.148969 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data" (OuterVolumeSpecName: "config-data") pod "e8bd79d0-d430-4023-8625-41e29f4839a0" (UID: "e8bd79d0-d430-4023-8625-41e29f4839a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.217323 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-httpd-config\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.218779 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-ovndb-tls-certs\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.218814 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-public-tls-certs\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.220209 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.218851 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-internal-tls-certs\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.221643 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-config\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.222315 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjtmh\" (UniqueName: \"kubernetes.io/projected/7baa4934-72f6-4235-9168-f7dbc175b63d-kube-api-access-fjtmh\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.222618 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-combined-ca-bundle\") pod \"7baa4934-72f6-4235-9168-f7dbc175b63d\" (UID: \"7baa4934-72f6-4235-9168-f7dbc175b63d\") " Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.223929 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.225146 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bd79d0-d430-4023-8625-41e29f4839a0-config-data\") on node \"crc\" DevicePath 
\"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.225166 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.249089 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7baa4934-72f6-4235-9168-f7dbc175b63d-kube-api-access-fjtmh" (OuterVolumeSpecName: "kube-api-access-fjtmh") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). InnerVolumeSpecName "kube-api-access-fjtmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.257104 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bd446449-s7v5p"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.295943 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.327462 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjtmh\" (UniqueName: \"kubernetes.io/projected/7baa4934-72f6-4235-9168-f7dbc175b63d-kube-api-access-fjtmh\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.327711 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.348059 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-config" (OuterVolumeSpecName: "config") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.348251 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.372063 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bc47cb9c7-j5cjp"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.386372 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.419217 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7baa4934-72f6-4235-9168-f7dbc175b63d" (UID: "7baa4934-72f6-4235-9168-f7dbc175b63d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.438903 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.438937 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.438945 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.438955 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7baa4934-72f6-4235-9168-f7dbc175b63d-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.464661 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.476069 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.514444 4735 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.577836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa","Type":"ContainerStarted","Data":"e1a455e99a01297cb63db057082c1338a6667aa79e56ed627dd5e6d7bf7b8a2d"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.580916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cddfdcf58-2zqd4" event={"ID":"e8bd79d0-d430-4023-8625-41e29f4839a0","Type":"ContainerDied","Data":"ee9b4dc2d90a8d5b2f8ae25d4d9d79129accc330feb8a94810fc2042d18fde74"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.581087 4735 scope.go:117] "RemoveContainer" containerID="4721eb628ca2c6f264ae7895f5071a05345b1f9fd8758b121631c952be5dcd23" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.581053 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cddfdcf58-2zqd4" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.602134 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77b7845dbf-mbcjv" event={"ID":"7baa4934-72f6-4235-9168-f7dbc175b63d","Type":"ContainerDied","Data":"02563a274a4cdb0d52f3f87ac4883a2211dddd48de1c14bc5c16950952743054"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.602167 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77b7845dbf-mbcjv" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.608579 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" event={"ID":"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111","Type":"ContainerStarted","Data":"9ba2851058879bceee1975ee5d1b42d679dbe16c6425dafb892c4528d7b4174e"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.616033 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc47cb9c7-j5cjp" event={"ID":"03b7ea53-cedc-42d2-a218-183b2be7646c","Type":"ContainerStarted","Data":"55c748cf9e62e7fe583b5f75489fd4df8724bae2c940cc295d3745363e953bfb"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.617031 4735 scope.go:117] "RemoveContainer" containerID="fcda2be56e91b43d4bcda1751a6fdd560fc5f32ad0cc8f5dfa28f7bc141fbd13" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.620170 4735 generic.go:334] "Generic (PLEG): container finished" podID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerID="8f5f1b727af3536437684fa1cb307383b03ad29a3b82534376f9026226d02b68" exitCode=2 Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.620215 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72dc2896-7cb6-4385-91a9-ae28c7f8907a","Type":"ContainerDied","Data":"8f5f1b727af3536437684fa1cb307383b03ad29a3b82534376f9026226d02b68"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.624336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" event={"ID":"dee3a5db-edbc-4df1-8087-2c6630561bb2","Type":"ContainerStarted","Data":"b37b408d512ebb85e11244d858e460e27546a0b4543577f1425b52ff33bb9544"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.627369 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.628110 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" podStartSLOduration=2.539363101 podStartE2EDuration="16.628083578s" podCreationTimestamp="2026-03-17 01:30:00 +0000 UTC" firstStartedPulling="2026-03-17 01:30:01.357960926 +0000 UTC m=+1226.990193904" lastFinishedPulling="2026-03-17 01:30:15.446681403 +0000 UTC m=+1241.078914381" observedRunningTime="2026-03-17 01:30:16.620336867 +0000 UTC m=+1242.252569845" watchObservedRunningTime="2026-03-17 01:30:16.628083578 +0000 UTC m=+1242.260316556" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.630568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9f6b305-68ba-47de-8770-6492fe920647","Type":"ContainerStarted","Data":"a7bea2599b06bd74003e27509f5f35aa0e1183cf5ecd4a95f61e2371882e0088"} Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.639099 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5cddfdcf58-2zqd4"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.648504 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5cddfdcf58-2zqd4"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.658530 4735 scope.go:117] "RemoveContainer" containerID="c6f64c6db3dd3e5c32fbbded6c1c35b16caf87a21b0878cd42eb55043dafc2f8" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.675978 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77b7845dbf-mbcjv"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.682545 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77b7845dbf-mbcjv"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.691511 4735 scope.go:117] "RemoveContainer" 
containerID="7fc5010997d8e172531eb08ef5e624286b7138788890be11979a5133c277fd45" Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.693405 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c46b7ff7-gkr8h"] Mar 17 01:30:16 crc kubenswrapper[4735]: I0317 01:30:16.702788 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c46b7ff7-gkr8h"] Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.084067 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" path="/var/lib/kubelet/pods/7baa4934-72f6-4235-9168-f7dbc175b63d/volumes" Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.084613 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9753d2c8-2d80-4019-965a-af117590e79f" path="/var/lib/kubelet/pods/9753d2c8-2d80-4019-965a-af117590e79f/volumes" Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.085178 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" path="/var/lib/kubelet/pods/e8bd79d0-d430-4023-8625-41e29f4839a0/volumes" Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.642020 4735 generic.go:334] "Generic (PLEG): container finished" podID="0b4f5e79-ad2d-4c96-8dae-7bd8360d2111" containerID="9ba2851058879bceee1975ee5d1b42d679dbe16c6425dafb892c4528d7b4174e" exitCode=0 Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.642646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" event={"ID":"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111","Type":"ContainerDied","Data":"9ba2851058879bceee1975ee5d1b42d679dbe16c6425dafb892c4528d7b4174e"} Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.660891 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc47cb9c7-j5cjp" 
event={"ID":"03b7ea53-cedc-42d2-a218-183b2be7646c","Type":"ContainerStarted","Data":"3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4"} Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.660932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc47cb9c7-j5cjp" event={"ID":"03b7ea53-cedc-42d2-a218-183b2be7646c","Type":"ContainerStarted","Data":"918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74"} Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.661003 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.664329 4735 generic.go:334] "Generic (PLEG): container finished" podID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerID="89ee5dabb834341edc8d7ed850ecb2e4e4d3b107e11515ad9885ab76894572e8" exitCode=0 Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.664392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" event={"ID":"dee3a5db-edbc-4df1-8087-2c6630561bb2","Type":"ContainerDied","Data":"89ee5dabb834341edc8d7ed850ecb2e4e4d3b107e11515ad9885ab76894572e8"} Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.668276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9f6b305-68ba-47de-8770-6492fe920647","Type":"ContainerStarted","Data":"f5c9a06c68e6a6eb803cba13ca97574849ddbd30f734bf69e3fefc9042617f62"} Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.670405 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa","Type":"ContainerStarted","Data":"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e"} Mar 17 01:30:17 crc kubenswrapper[4735]: I0317 01:30:17.686064 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bc47cb9c7-j5cjp" 
podStartSLOduration=8.686044679 podStartE2EDuration="8.686044679s" podCreationTimestamp="2026-03-17 01:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:17.678737739 +0000 UTC m=+1243.310970717" watchObservedRunningTime="2026-03-17 01:30:17.686044679 +0000 UTC m=+1243.318277657" Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.483746 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c46b7ff7-gkr8h" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: i/o timeout" Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.695347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa","Type":"ContainerStarted","Data":"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229"} Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.695414 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api-log" containerID="cri-o://d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e" gracePeriod=30 Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.695492 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api" containerID="cri-o://4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229" gracePeriod=30 Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.695686 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.713048 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-84bd446449-s7v5p" event={"ID":"dee3a5db-edbc-4df1-8087-2c6630561bb2","Type":"ContainerStarted","Data":"6a6aaca10444de81ed64104bf8c16d872e36eb0e6fbb0a48917a05df9227ab2b"} Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.713189 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.716382 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9f6b305-68ba-47de-8770-6492fe920647","Type":"ContainerStarted","Data":"fb1e21c83b08ed06ccfce738da4be2e7d6bd4aa275643fdfe6f2d80ee7a46f48"} Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.739664 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.739639283 podStartE2EDuration="5.739639283s" podCreationTimestamp="2026-03-17 01:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:18.726104861 +0000 UTC m=+1244.358337839" watchObservedRunningTime="2026-03-17 01:30:18.739639283 +0000 UTC m=+1244.371872271" Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.752696 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" podStartSLOduration=5.752677752 podStartE2EDuration="5.752677752s" podCreationTimestamp="2026-03-17 01:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:18.74975817 +0000 UTC m=+1244.381991148" watchObservedRunningTime="2026-03-17 01:30:18.752677752 +0000 UTC m=+1244.384910730" Mar 17 01:30:18 crc kubenswrapper[4735]: I0317 01:30:18.758818 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 17 01:30:18 crc 
kubenswrapper[4735]: I0317 01:30:18.775172 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.299588206 podStartE2EDuration="5.775153732s" podCreationTimestamp="2026-03-17 01:30:13 +0000 UTC" firstStartedPulling="2026-03-17 01:30:16.514189909 +0000 UTC m=+1242.146422887" lastFinishedPulling="2026-03-17 01:30:16.989755435 +0000 UTC m=+1242.621988413" observedRunningTime="2026-03-17 01:30:18.770847637 +0000 UTC m=+1244.403080615" watchObservedRunningTime="2026-03-17 01:30:18.775153732 +0000 UTC m=+1244.407386710" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.099699 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.185760 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvghw\" (UniqueName: \"kubernetes.io/projected/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111-kube-api-access-rvghw\") pod \"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111\" (UID: \"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.200385 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111-kube-api-access-rvghw" (OuterVolumeSpecName: "kube-api-access-rvghw") pod "0b4f5e79-ad2d-4c96-8dae-7bd8360d2111" (UID: "0b4f5e79-ad2d-4c96-8dae-7bd8360d2111"). InnerVolumeSpecName "kube-api-access-rvghw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.287226 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvghw\" (UniqueName: \"kubernetes.io/projected/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111-kube-api-access-rvghw\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.430351 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.489202 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-etc-machine-id\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.489279 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ng6v\" (UniqueName: \"kubernetes.io/projected/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-kube-api-access-6ng6v\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.489319 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data-custom\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.489372 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 
01:30:19.489396 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-combined-ca-bundle\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.489415 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-scripts\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.489443 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-logs\") pod \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\" (UID: \"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa\") " Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.490075 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-logs" (OuterVolumeSpecName: "logs") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.490114 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.500706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-kube-api-access-6ng6v" (OuterVolumeSpecName: "kube-api-access-6ng6v") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "kube-api-access-6ng6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.509354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.518017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-scripts" (OuterVolumeSpecName: "scripts") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.527865 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.555202 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data" (OuterVolumeSpecName: "config-data") pod "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" (UID: "0c3daa06-dcd5-4081-84cc-07c87a2bbfaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592036 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592066 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592078 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592087 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592096 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592105 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ng6v\" (UniqueName: 
\"kubernetes.io/projected/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-kube-api-access-6ng6v\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.592113 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.682502 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-5mvsn"] Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.691043 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-5mvsn"] Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.724651 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerID="4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229" exitCode=0 Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.724679 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerID="d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e" exitCode=143 Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.724752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa","Type":"ContainerDied","Data":"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229"} Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.724812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa","Type":"ContainerDied","Data":"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e"} Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.724826 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"0c3daa06-dcd5-4081-84cc-07c87a2bbfaa","Type":"ContainerDied","Data":"e1a455e99a01297cb63db057082c1338a6667aa79e56ed627dd5e6d7bf7b8a2d"} Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.724872 4735 scope.go:117] "RemoveContainer" containerID="4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.725813 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.728642 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.738476 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-jpv8n" event={"ID":"0b4f5e79-ad2d-4c96-8dae-7bd8360d2111","Type":"ContainerDied","Data":"4d0001a3753137dbace70e98be9f4ca970e81fdc6b781dd624f98afdf9e6da39"} Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.738515 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0001a3753137dbace70e98be9f4ca970e81fdc6b781dd624f98afdf9e6da39" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.745821 4735 scope.go:117] "RemoveContainer" containerID="d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.762013 4735 scope.go:117] "RemoveContainer" containerID="4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.762390 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229\": container with ID starting with 4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229 not found: ID does not exist" 
containerID="4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.762421 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229"} err="failed to get container status \"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229\": rpc error: code = NotFound desc = could not find container \"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229\": container with ID starting with 4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229 not found: ID does not exist" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.762439 4735 scope.go:117] "RemoveContainer" containerID="d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.762805 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e\": container with ID starting with d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e not found: ID does not exist" containerID="d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.762826 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e"} err="failed to get container status \"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e\": rpc error: code = NotFound desc = could not find container \"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e\": container with ID starting with d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e not found: ID does not exist" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.762838 4735 scope.go:117] 
"RemoveContainer" containerID="4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.763095 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229"} err="failed to get container status \"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229\": rpc error: code = NotFound desc = could not find container \"4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229\": container with ID starting with 4efb8fb4ba25a2c49cec4f4b1673d4b59946bba52a9c0aae8ecb7ac17c0f6229 not found: ID does not exist" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.763111 4735 scope.go:117] "RemoveContainer" containerID="d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.763323 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e"} err="failed to get container status \"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e\": rpc error: code = NotFound desc = could not find container \"d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e\": container with ID starting with d97e5d6f4b5ab1fe6b61fbebe32dba64e9998255fd5ca0145e559ca4de91f65e not found: ID does not exist" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.766487 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.774203 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.801790 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802263 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802282 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-api" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802303 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802309 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802326 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4f5e79-ad2d-4c96-8dae-7bd8360d2111" containerName="oc" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802332 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4f5e79-ad2d-4c96-8dae-7bd8360d2111" containerName="oc" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802341 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802347 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802358 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api-log" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802363 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api-log" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802378 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="dnsmasq-dns" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802385 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="dnsmasq-dns" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802396 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="init" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802402 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="init" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802409 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-httpd" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802415 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-httpd" Mar 17 01:30:19 crc kubenswrapper[4735]: E0317 01:30:19.802423 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802428 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802596 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" containerName="cinder-api-log" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802610 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802619 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" 
containerName="cinder-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802625 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4f5e79-ad2d-4c96-8dae-7bd8360d2111" containerName="oc" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802637 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7baa4934-72f6-4235-9168-f7dbc175b63d" containerName="neutron-httpd" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802645 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9753d2c8-2d80-4019-965a-af117590e79f" containerName="dnsmasq-dns" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802651 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.802662 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bd79d0-d430-4023-8625-41e29f4839a0" containerName="barbican-api-log" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.803583 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.808117 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.808276 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.814984 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.834347 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-config-data\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898400 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898449 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898513 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e91a303-0695-4862-ad03-3c9828b5a3a5-logs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898541 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898579 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-scripts\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwf9s\" (UniqueName: \"kubernetes.io/projected/7e91a303-0695-4862-ad03-3c9828b5a3a5-kube-api-access-fwf9s\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:19 crc kubenswrapper[4735]: I0317 01:30:19.898634 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e91a303-0695-4862-ad03-3c9828b5a3a5-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.000920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e91a303-0695-4862-ad03-3c9828b5a3a5-logs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.000968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.000995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-scripts\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwf9s\" (UniqueName: \"kubernetes.io/projected/7e91a303-0695-4862-ad03-3c9828b5a3a5-kube-api-access-fwf9s\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001074 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e91a303-0695-4862-ad03-3c9828b5a3a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-config-data\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001160 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.001877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e91a303-0695-4862-ad03-3c9828b5a3a5-logs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.002464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e91a303-0695-4862-ad03-3c9828b5a3a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc 
kubenswrapper[4735]: I0317 01:30:20.006137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-config-data\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.006369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-scripts\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.007485 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.008002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.008301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.013298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e91a303-0695-4862-ad03-3c9828b5a3a5-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.019581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwf9s\" (UniqueName: \"kubernetes.io/projected/7e91a303-0695-4862-ad03-3c9828b5a3a5-kube-api-access-fwf9s\") pod \"cinder-api-0\" (UID: \"7e91a303-0695-4862-ad03-3c9828b5a3a5\") " pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.131848 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.618502 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.764437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e91a303-0695-4862-ad03-3c9828b5a3a5","Type":"ContainerStarted","Data":"63cc98438d011d77e92cb06c8066ac2fc49280210bba40d81f60ab2fddc6d4d8"} Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.772821 4735 generic.go:334] "Generic (PLEG): container finished" podID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerID="42e38c0b456e388752e43b9caa8cfb5117ac218cad2a6a4ed2ea4e36c016b6fa" exitCode=0 Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.773687 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72dc2896-7cb6-4385-91a9-ae28c7f8907a","Type":"ContainerDied","Data":"42e38c0b456e388752e43b9caa8cfb5117ac218cad2a6a4ed2ea4e36c016b6fa"} Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.827991 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.944781 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8slq\" (UniqueName: \"kubernetes.io/projected/72dc2896-7cb6-4385-91a9-ae28c7f8907a-kube-api-access-h8slq\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.944824 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-log-httpd\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.944915 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-sg-core-conf-yaml\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.944967 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-scripts\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.944983 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-config-data\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.945111 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-combined-ca-bundle\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.945163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-run-httpd\") pod \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\" (UID: \"72dc2896-7cb6-4385-91a9-ae28c7f8907a\") " Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.946202 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.952797 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-scripts" (OuterVolumeSpecName: "scripts") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.953058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.967429 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72dc2896-7cb6-4385-91a9-ae28c7f8907a-kube-api-access-h8slq" (OuterVolumeSpecName: "kube-api-access-h8slq") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "kube-api-access-h8slq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:20 crc kubenswrapper[4735]: I0317 01:30:20.983892 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.015173 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.023017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-config-data" (OuterVolumeSpecName: "config-data") pod "72dc2896-7cb6-4385-91a9-ae28c7f8907a" (UID: "72dc2896-7cb6-4385-91a9-ae28c7f8907a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047040 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047301 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047311 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047324 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047334 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8slq\" (UniqueName: \"kubernetes.io/projected/72dc2896-7cb6-4385-91a9-ae28c7f8907a-kube-api-access-h8slq\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047343 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72dc2896-7cb6-4385-91a9-ae28c7f8907a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.047351 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72dc2896-7cb6-4385-91a9-ae28c7f8907a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.086780 4735 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0c3daa06-dcd5-4081-84cc-07c87a2bbfaa" path="/var/lib/kubelet/pods/0c3daa06-dcd5-4081-84cc-07c87a2bbfaa/volumes" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.089941 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40cbfa1-7f4e-4e47-a8a6-444a4c087efd" path="/var/lib/kubelet/pods/c40cbfa1-7f4e-4e47-a8a6-444a4c087efd/volumes" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.790117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e91a303-0695-4862-ad03-3c9828b5a3a5","Type":"ContainerStarted","Data":"318f1d560351604fa42a42aa5c5bc883852eb4853738cd4c6ca4e9b9159d3281"} Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.792950 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72dc2896-7cb6-4385-91a9-ae28c7f8907a","Type":"ContainerDied","Data":"9638b436c7a4e07c8e54ba234e95e3f3e554adf1202e4a3d969fbfe29eab847b"} Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.792979 4735 scope.go:117] "RemoveContainer" containerID="8f5f1b727af3536437684fa1cb307383b03ad29a3b82534376f9026226d02b68" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.793097 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.864917 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.874636 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.884277 4735 scope.go:117] "RemoveContainer" containerID="42e38c0b456e388752e43b9caa8cfb5117ac218cad2a6a4ed2ea4e36c016b6fa" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.937103 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:21 crc kubenswrapper[4735]: E0317 01:30:21.937693 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="sg-core" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.937707 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="sg-core" Mar 17 01:30:21 crc kubenswrapper[4735]: E0317 01:30:21.937724 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="ceilometer-notification-agent" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.937731 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="ceilometer-notification-agent" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.937916 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="sg-core" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.937943 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" containerName="ceilometer-notification-agent" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.939378 4735 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.953999 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.954150 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:30:21 crc kubenswrapper[4735]: I0317 01:30:21.988105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.085837 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.085935 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-log-httpd\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.085992 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.086038 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-scripts\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " 
pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.086073 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-config-data\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.086088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6ww\" (UniqueName: \"kubernetes.io/projected/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-kube-api-access-ck6ww\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.086120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-run-httpd\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.195832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.195920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-scripts\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.195957 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-config-data\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.195975 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6ww\" (UniqueName: \"kubernetes.io/projected/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-kube-api-access-ck6ww\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.196009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-run-httpd\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.196040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.196090 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-log-httpd\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.196535 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-log-httpd\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc 
kubenswrapper[4735]: I0317 01:30:22.196952 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-run-httpd\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.203615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-config-data\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.203929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.204590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-scripts\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.210538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.211234 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6ww\" (UniqueName: \"kubernetes.io/projected/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-kube-api-access-ck6ww\") pod \"ceilometer-0\" (UID: 
\"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.215544 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.286305 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-9455b57b9-66mtr" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.299726 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.473652 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.600334 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.779759 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.805449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerStarted","Data":"088b5f635c67627ff639f2af384840c0601ebf41dc2489f5de9b290f2bd9eb7e"} Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.806619 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e91a303-0695-4862-ad03-3c9828b5a3a5","Type":"ContainerStarted","Data":"40da655eec4364237cbf93034070615bd35592795acc56f3ac480b2164d30f46"} Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.809053 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.902562 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=3.902535958 podStartE2EDuration="3.902535958s" podCreationTimestamp="2026-03-17 01:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:22.851495828 +0000 UTC m=+1248.483728796" watchObservedRunningTime="2026-03-17 01:30:22.902535958 +0000 UTC m=+1248.534768936" Mar 17 01:30:22 crc kubenswrapper[4735]: I0317 01:30:22.954071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9b8f4bc48-rdtm4" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.091143 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72dc2896-7cb6-4385-91a9-ae28c7f8907a" path="/var/lib/kubelet/pods/72dc2896-7cb6-4385-91a9-ae28c7f8907a/volumes" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.387214 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.391512 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.393957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.396479 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vvjl9" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.405714 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.420471 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.460153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.460244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxzsd\" (UniqueName: \"kubernetes.io/projected/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-kube-api-access-bxzsd\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.460273 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.460307 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-openstack-config\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.561976 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxzsd\" (UniqueName: \"kubernetes.io/projected/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-kube-api-access-bxzsd\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.562436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.562555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-openstack-config\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.562727 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.563437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-openstack-config\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.567950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.569349 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.580766 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxzsd\" (UniqueName: \"kubernetes.io/projected/8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c-kube-api-access-bxzsd\") pod \"openstackclient\" (UID: \"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c\") " pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.720168 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 17 01:30:23 crc kubenswrapper[4735]: I0317 01:30:23.859163 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerStarted","Data":"c103170577fe4a9863aef3582a529d0a8802811946102c994ebca36ea21e535d"} Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.269995 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.290113 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 17 01:30:24 crc kubenswrapper[4735]: W0317 01:30:24.290826 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bc7db93_d3a0_4af3_b0e8_5cd148eaeb2c.slice/crio-32f18d397bf77164cb47ad5863a777546283eb683701ea5838a982fee65f17e1 WatchSource:0}: Error finding container 32f18d397bf77164cb47ad5863a777546283eb683701ea5838a982fee65f17e1: Status 404 returned error can't find the container with id 32f18d397bf77164cb47ad5863a777546283eb683701ea5838a982fee65f17e1 Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.336392 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.430575 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d7b898ff7-5l74x"] Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.430827 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="dnsmasq-dns" containerID="cri-o://0febb4cda137a9f64042f0ee793b17229c3087501a644197f2c50195f4802dc1" gracePeriod=10 Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.445530 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.897328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerStarted","Data":"08513105419d0794b474f0e1b3d37209bb8aa9050cb4b6196b78890f7b7104e4"} Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.899112 4735 generic.go:334] "Generic (PLEG): container finished" podID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerID="0febb4cda137a9f64042f0ee793b17229c3087501a644197f2c50195f4802dc1" exitCode=0 Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.899155 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" event={"ID":"860ecebd-d15d-483b-bfbd-8c5c80682d6f","Type":"ContainerDied","Data":"0febb4cda137a9f64042f0ee793b17229c3087501a644197f2c50195f4802dc1"} Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.900350 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="cinder-scheduler" containerID="cri-o://f5c9a06c68e6a6eb803cba13ca97574849ddbd30f734bf69e3fefc9042617f62" gracePeriod=30 Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.900584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c","Type":"ContainerStarted","Data":"32f18d397bf77164cb47ad5863a777546283eb683701ea5838a982fee65f17e1"} Mar 17 01:30:24 crc kubenswrapper[4735]: I0317 01:30:24.901645 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="probe" containerID="cri-o://fb1e21c83b08ed06ccfce738da4be2e7d6bd4aa275643fdfe6f2d80ee7a46f48" gracePeriod=30 Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 
01:30:25.115202 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.314899 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lp56\" (UniqueName: \"kubernetes.io/projected/860ecebd-d15d-483b-bfbd-8c5c80682d6f-kube-api-access-8lp56\") pod \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.315004 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-nb\") pod \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.315062 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-config\") pod \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.315218 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-swift-storage-0\") pod \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.315278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-svc\") pod \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.315304 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-sb\") pod \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\" (UID: \"860ecebd-d15d-483b-bfbd-8c5c80682d6f\") " Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.361050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860ecebd-d15d-483b-bfbd-8c5c80682d6f-kube-api-access-8lp56" (OuterVolumeSpecName: "kube-api-access-8lp56") pod "860ecebd-d15d-483b-bfbd-8c5c80682d6f" (UID: "860ecebd-d15d-483b-bfbd-8c5c80682d6f"). InnerVolumeSpecName "kube-api-access-8lp56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.420323 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lp56\" (UniqueName: \"kubernetes.io/projected/860ecebd-d15d-483b-bfbd-8c5c80682d6f-kube-api-access-8lp56\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.456453 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-config" (OuterVolumeSpecName: "config") pod "860ecebd-d15d-483b-bfbd-8c5c80682d6f" (UID: "860ecebd-d15d-483b-bfbd-8c5c80682d6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.506332 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "860ecebd-d15d-483b-bfbd-8c5c80682d6f" (UID: "860ecebd-d15d-483b-bfbd-8c5c80682d6f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.519353 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "860ecebd-d15d-483b-bfbd-8c5c80682d6f" (UID: "860ecebd-d15d-483b-bfbd-8c5c80682d6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.522451 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.522480 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.522489 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.522920 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "860ecebd-d15d-483b-bfbd-8c5c80682d6f" (UID: "860ecebd-d15d-483b-bfbd-8c5c80682d6f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.554404 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "860ecebd-d15d-483b-bfbd-8c5c80682d6f" (UID: "860ecebd-d15d-483b-bfbd-8c5c80682d6f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.623593 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.624116 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/860ecebd-d15d-483b-bfbd-8c5c80682d6f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.939204 4735 generic.go:334] "Generic (PLEG): container finished" podID="a9f6b305-68ba-47de-8770-6492fe920647" containerID="fb1e21c83b08ed06ccfce738da4be2e7d6bd4aa275643fdfe6f2d80ee7a46f48" exitCode=0 Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.939274 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9f6b305-68ba-47de-8770-6492fe920647","Type":"ContainerDied","Data":"fb1e21c83b08ed06ccfce738da4be2e7d6bd4aa275643fdfe6f2d80ee7a46f48"} Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.953926 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerStarted","Data":"a8ad0e32b377d13a77f249cde37ff356a4b57b06deabf7dee6bd1ab518148e4a"} Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.963681 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" event={"ID":"860ecebd-d15d-483b-bfbd-8c5c80682d6f","Type":"ContainerDied","Data":"1f15b6059f4d58f462c12646f354432a5b315f5caa8880cd31cb91e83a7b0e28"} Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.963727 4735 scope.go:117] "RemoveContainer" containerID="0febb4cda137a9f64042f0ee793b17229c3087501a644197f2c50195f4802dc1" Mar 17 01:30:25 crc kubenswrapper[4735]: I0317 01:30:25.963895 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.025953 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d7b898ff7-5l74x"] Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.035782 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d7b898ff7-5l74x"] Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.066649 4735 scope.go:117] "RemoveContainer" containerID="1d80434aca65f06146d11ceea9a66dd01ec651bc1cc2840555b89f682b28bff7" Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.199213 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-76cdf95cd8-vx5pd" Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.290685 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-674b457696-6r8nd"] Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.290910 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon-log" containerID="cri-o://88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9" gracePeriod=30 Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.291015 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-674b457696-6r8nd" 
podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" containerID="cri-o://67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5" gracePeriod=30 Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.305702 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.992632 4735 generic.go:334] "Generic (PLEG): container finished" podID="a9f6b305-68ba-47de-8770-6492fe920647" containerID="f5c9a06c68e6a6eb803cba13ca97574849ddbd30f734bf69e3fefc9042617f62" exitCode=0 Mar 17 01:30:26 crc kubenswrapper[4735]: I0317 01:30:26.992926 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9f6b305-68ba-47de-8770-6492fe920647","Type":"ContainerDied","Data":"f5c9a06c68e6a6eb803cba13ca97574849ddbd30f734bf69e3fefc9042617f62"} Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.090016 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" path="/var/lib/kubelet/pods/860ecebd-d15d-483b-bfbd-8c5c80682d6f/volumes" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.231841 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.369517 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-combined-ca-bundle\") pod \"a9f6b305-68ba-47de-8770-6492fe920647\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.369777 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f6b305-68ba-47de-8770-6492fe920647-etc-machine-id\") pod \"a9f6b305-68ba-47de-8770-6492fe920647\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.369908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lk5q\" (UniqueName: \"kubernetes.io/projected/a9f6b305-68ba-47de-8770-6492fe920647-kube-api-access-7lk5q\") pod \"a9f6b305-68ba-47de-8770-6492fe920647\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.369992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data-custom\") pod \"a9f6b305-68ba-47de-8770-6492fe920647\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.370016 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-scripts\") pod \"a9f6b305-68ba-47de-8770-6492fe920647\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.370098 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data\") pod \"a9f6b305-68ba-47de-8770-6492fe920647\" (UID: \"a9f6b305-68ba-47de-8770-6492fe920647\") " Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.369902 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9f6b305-68ba-47de-8770-6492fe920647-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a9f6b305-68ba-47de-8770-6492fe920647" (UID: "a9f6b305-68ba-47de-8770-6492fe920647"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.393560 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f6b305-68ba-47de-8770-6492fe920647-kube-api-access-7lk5q" (OuterVolumeSpecName: "kube-api-access-7lk5q") pod "a9f6b305-68ba-47de-8770-6492fe920647" (UID: "a9f6b305-68ba-47de-8770-6492fe920647"). InnerVolumeSpecName "kube-api-access-7lk5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.396490 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-scripts" (OuterVolumeSpecName: "scripts") pod "a9f6b305-68ba-47de-8770-6492fe920647" (UID: "a9f6b305-68ba-47de-8770-6492fe920647"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.400097 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a9f6b305-68ba-47de-8770-6492fe920647" (UID: "a9f6b305-68ba-47de-8770-6492fe920647"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.472447 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lk5q\" (UniqueName: \"kubernetes.io/projected/a9f6b305-68ba-47de-8770-6492fe920647-kube-api-access-7lk5q\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.472479 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.472488 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.472497 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f6b305-68ba-47de-8770-6492fe920647-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.511585 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9f6b305-68ba-47de-8770-6492fe920647" (UID: "a9f6b305-68ba-47de-8770-6492fe920647"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.574097 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.595556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data" (OuterVolumeSpecName: "config-data") pod "a9f6b305-68ba-47de-8770-6492fe920647" (UID: "a9f6b305-68ba-47de-8770-6492fe920647"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:27 crc kubenswrapper[4735]: I0317 01:30:27.675551 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f6b305-68ba-47de-8770-6492fe920647-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.014066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9f6b305-68ba-47de-8770-6492fe920647","Type":"ContainerDied","Data":"a7bea2599b06bd74003e27509f5f35aa0e1183cf5ecd4a95f61e2371882e0088"} Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.014301 4735 scope.go:117] "RemoveContainer" containerID="fb1e21c83b08ed06ccfce738da4be2e7d6bd4aa275643fdfe6f2d80ee7a46f48" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.014378 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.032356 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerStarted","Data":"a4a2469a21ca4659b0a2d58bffb535ca7d9475ef2bce7abb4791fdd7aae28a5f"} Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.033226 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.054348 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.058009 4735 scope.go:117] "RemoveContainer" containerID="f5c9a06c68e6a6eb803cba13ca97574849ddbd30f734bf69e3fefc9042617f62" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.065294 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.089313 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.581404216 podStartE2EDuration="7.08929679s" podCreationTimestamp="2026-03-17 01:30:21 +0000 UTC" firstStartedPulling="2026-03-17 01:30:22.753603181 +0000 UTC m=+1248.385836159" lastFinishedPulling="2026-03-17 01:30:27.261495755 +0000 UTC m=+1252.893728733" observedRunningTime="2026-03-17 01:30:28.082107504 +0000 UTC m=+1253.714340472" watchObservedRunningTime="2026-03-17 01:30:28.08929679 +0000 UTC m=+1253.721529768" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108180 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:28 crc kubenswrapper[4735]: E0317 01:30:28.108520 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="probe" Mar 17 01:30:28 crc 
kubenswrapper[4735]: I0317 01:30:28.108537 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="probe" Mar 17 01:30:28 crc kubenswrapper[4735]: E0317 01:30:28.108552 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="cinder-scheduler" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108558 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="cinder-scheduler" Mar 17 01:30:28 crc kubenswrapper[4735]: E0317 01:30:28.108586 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="init" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108592 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="init" Mar 17 01:30:28 crc kubenswrapper[4735]: E0317 01:30:28.108604 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="dnsmasq-dns" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108610 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="dnsmasq-dns" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108761 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="cinder-scheduler" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108777 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f6b305-68ba-47de-8770-6492fe920647" containerName="probe" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.108788 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="dnsmasq-dns" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.109634 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.116809 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.128569 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.287126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97j28\" (UniqueName: \"kubernetes.io/projected/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-kube-api-access-97j28\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.287465 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.287637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-config-data\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.287755 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 
17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.287899 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.288034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-scripts\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.395841 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.395922 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-scripts\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.395949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97j28\" (UniqueName: \"kubernetes.io/projected/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-kube-api-access-97j28\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.395992 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.396065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-config-data\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.396084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.396345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.400277 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.400674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.400778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.402542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-scripts\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.413018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97j28\" (UniqueName: \"kubernetes.io/projected/28b4ea44-a1cd-4c67-935c-abdfa2ddb16a-kube-api-access-97j28\") pod \"cinder-scheduler-0\" (UID: \"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a\") " pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.466631 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 01:30:28 crc kubenswrapper[4735]: I0317 01:30:28.897343 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 01:30:28 crc kubenswrapper[4735]: W0317 01:30:28.912914 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b4ea44_a1cd_4c67_935c_abdfa2ddb16a.slice/crio-1ce50a46a8328179432a1d6527ae19617eacb3b803ce72e52281c3841f7ba598 WatchSource:0}: Error finding container 1ce50a46a8328179432a1d6527ae19617eacb3b803ce72e52281c3841f7ba598: Status 404 returned error can't find the container with id 1ce50a46a8328179432a1d6527ae19617eacb3b803ce72e52281c3841f7ba598 Mar 17 01:30:29 crc kubenswrapper[4735]: I0317 01:30:29.043152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a","Type":"ContainerStarted","Data":"1ce50a46a8328179432a1d6527ae19617eacb3b803ce72e52281c3841f7ba598"} Mar 17 01:30:29 crc kubenswrapper[4735]: I0317 01:30:29.082898 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f6b305-68ba-47de-8770-6492fe920647" path="/var/lib/kubelet/pods/a9f6b305-68ba-47de-8770-6492fe920647/volumes" Mar 17 01:30:29 crc kubenswrapper[4735]: I0317 01:30:29.828109 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7b898ff7-5l74x" podUID="860ecebd-d15d-483b-bfbd-8c5c80682d6f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.169:5353: i/o timeout" Mar 17 01:30:29 crc kubenswrapper[4735]: I0317 01:30:29.870078 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 
10.217.0.2:56452->10.217.0.157:8443: read: connection reset by peer" Mar 17 01:30:29 crc kubenswrapper[4735]: I0317 01:30:29.870540 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 01:30:30 crc kubenswrapper[4735]: I0317 01:30:30.099702 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a","Type":"ContainerStarted","Data":"3a2e7cd45c05c560b02b0a11ff6004e89089424b085387b8ab7ed116c4d61110"} Mar 17 01:30:30 crc kubenswrapper[4735]: I0317 01:30:30.127813 4735 generic.go:334] "Generic (PLEG): container finished" podID="32c72925-26da-41c7-8279-8bc23ef68b62" containerID="67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5" exitCode=0 Mar 17 01:30:30 crc kubenswrapper[4735]: I0317 01:30:30.127868 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b457696-6r8nd" event={"ID":"32c72925-26da-41c7-8279-8bc23ef68b62","Type":"ContainerDied","Data":"67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5"} Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.136406 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28b4ea44-a1cd-4c67-935c-abdfa2ddb16a","Type":"ContainerStarted","Data":"0240f10401dee08946b7632ef7c5bb7e55f88aa7329e9193e190af7fe33d04fe"} Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.164373 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.164356242 podStartE2EDuration="3.164356242s" podCreationTimestamp="2026-03-17 01:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:31.16060701 +0000 UTC m=+1256.792839988" watchObservedRunningTime="2026-03-17 01:30:31.164356242 +0000 UTC m=+1256.796589220" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.946922 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7c85bb6db7-nz59r"] Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.948327 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.957032 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.957150 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vsg75" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.957232 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.979330 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data-custom\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.979368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-combined-ca-bundle\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.979449 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.979501 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvwt\" (UniqueName: \"kubernetes.io/projected/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-kube-api-access-8mvwt\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:31 crc kubenswrapper[4735]: I0317 01:30:31.998914 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7c85bb6db7-nz59r"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.081568 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mvwt\" (UniqueName: \"kubernetes.io/projected/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-kube-api-access-8mvwt\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.081864 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data-custom\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.081890 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-combined-ca-bundle\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: 
\"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.082035 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.099752 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.107898 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data-custom\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.121496 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-combined-ca-bundle\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.152571 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-67694b8cd8-nkbk6"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.154610 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.160923 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.168559 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mvwt\" (UniqueName: \"kubernetes.io/projected/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-kube-api-access-8mvwt\") pod \"heat-engine-7c85bb6db7-nz59r\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.182083 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c7b9b57c-2t9zn"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.183363 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgxq\" (UniqueName: \"kubernetes.io/projected/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-kube-api-access-vjgxq\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.183475 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.183597 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-combined-ca-bundle\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " 
pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.183680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data-custom\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.183893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.219676 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c7b9b57c-2t9zn"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.237528 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67694b8cd8-nkbk6"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.272809 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-86957cdc-x94sr"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.274646 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.280176 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.280491 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86957cdc-x94sr"] Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgxq\" (UniqueName: \"kubernetes.io/projected/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-kube-api-access-vjgxq\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284742 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284769 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-sb\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-combined-ca-bundle\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc 
kubenswrapper[4735]: I0317 01:30:32.284818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-swift-storage-0\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data-custom\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284875 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data-custom\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-combined-ca-bundle\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-nb\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " 
pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284969 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbz7\" (UniqueName: \"kubernetes.io/projected/15817936-1648-4ce6-bfdd-c1cea98fe7e9-kube-api-access-glbz7\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.284985 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84nxj\" (UniqueName: \"kubernetes.io/projected/c50099a5-67b0-4c0b-be11-146f30190beb-kube-api-access-84nxj\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.285002 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-svc\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.285029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.285062 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-config\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: 
\"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.290577 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.294965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.326938 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data-custom\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.354542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgxq\" (UniqueName: \"kubernetes.io/projected/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-kube-api-access-vjgxq\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.355072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-combined-ca-bundle\") pod \"heat-cfnapi-67694b8cd8-nkbk6\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data-custom\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392595 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-combined-ca-bundle\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-nb\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glbz7\" (UniqueName: \"kubernetes.io/projected/15817936-1648-4ce6-bfdd-c1cea98fe7e9-kube-api-access-glbz7\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84nxj\" (UniqueName: \"kubernetes.io/projected/c50099a5-67b0-4c0b-be11-146f30190beb-kube-api-access-84nxj\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-svc\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392698 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-config\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-sb\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.392805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-swift-storage-0\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.393581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-swift-storage-0\") pod 
\"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.395011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-nb\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.397154 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-svc\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.398606 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-config\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.398652 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-sb\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.400738 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-combined-ca-bundle\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" 
Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.402841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data-custom\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.403772 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.419812 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84nxj\" (UniqueName: \"kubernetes.io/projected/c50099a5-67b0-4c0b-be11-146f30190beb-kube-api-access-84nxj\") pod \"heat-api-86957cdc-x94sr\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.433488 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbz7\" (UniqueName: \"kubernetes.io/projected/15817936-1648-4ce6-bfdd-c1cea98fe7e9-kube-api-access-glbz7\") pod \"dnsmasq-dns-65c7b9b57c-2t9zn\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.567093 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.572448 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:32 crc kubenswrapper[4735]: I0317 01:30:32.689546 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.082570 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7c85bb6db7-nz59r"] Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.204942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7c85bb6db7-nz59r" event={"ID":"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25","Type":"ContainerStarted","Data":"4891417cfbd7e37b5ef44bc89c69c9a75bd835f06d210a3d974f5414c82292b8"} Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.341015 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67694b8cd8-nkbk6"] Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.470953 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.615688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c7b9b57c-2t9zn"] Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.772659 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86957cdc-x94sr"] Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.823385 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/horizon-76cdf95cd8-vx5pd" podUID="fc3f6d90-40e7-4962-b788-1e9924edb48f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.894621 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.993389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw422\" (UniqueName: \"kubernetes.io/projected/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-kube-api-access-hw422\") pod \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.993456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data-custom\") pod \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.993529 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data\") pod \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.993595 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-combined-ca-bundle\") pod \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.993672 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-logs\") pod \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\" (UID: \"aedf685c-adde-4ccb-8a55-9c54e59ec1d2\") " Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.994312 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-logs" (OuterVolumeSpecName: "logs") pod "aedf685c-adde-4ccb-8a55-9c54e59ec1d2" (UID: "aedf685c-adde-4ccb-8a55-9c54e59ec1d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:33 crc kubenswrapper[4735]: I0317 01:30:33.997759 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-kube-api-access-hw422" (OuterVolumeSpecName: "kube-api-access-hw422") pod "aedf685c-adde-4ccb-8a55-9c54e59ec1d2" (UID: "aedf685c-adde-4ccb-8a55-9c54e59ec1d2"). InnerVolumeSpecName "kube-api-access-hw422". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.002960 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aedf685c-adde-4ccb-8a55-9c54e59ec1d2" (UID: "aedf685c-adde-4ccb-8a55-9c54e59ec1d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.042039 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aedf685c-adde-4ccb-8a55-9c54e59ec1d2" (UID: "aedf685c-adde-4ccb-8a55-9c54e59ec1d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.067243 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data" (OuterVolumeSpecName: "config-data") pod "aedf685c-adde-4ccb-8a55-9c54e59ec1d2" (UID: "aedf685c-adde-4ccb-8a55-9c54e59ec1d2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.095103 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.095134 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.095143 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw422\" (UniqueName: \"kubernetes.io/projected/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-kube-api-access-hw422\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.095152 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.095160 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedf685c-adde-4ccb-8a55-9c54e59ec1d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.145040 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="7e91a303-0695-4862-ad03-3c9828b5a3a5" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.179:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.252227 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7c85bb6db7-nz59r" 
event={"ID":"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25","Type":"ContainerStarted","Data":"9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be"} Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.253322 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.260106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" event={"ID":"3aaa377c-bdde-46eb-89e4-361d0fa8cb36","Type":"ContainerStarted","Data":"f37b4c3fd9103c506c21dc17e3337cddb782b98deeeb1f718d0c2ccf78282fa0"} Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.267076 4735 generic.go:334] "Generic (PLEG): container finished" podID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerID="5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5" exitCode=137 Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.267139 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8d99585df-2dfcf" event={"ID":"aedf685c-adde-4ccb-8a55-9c54e59ec1d2","Type":"ContainerDied","Data":"5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5"} Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.267167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8d99585df-2dfcf" event={"ID":"aedf685c-adde-4ccb-8a55-9c54e59ec1d2","Type":"ContainerDied","Data":"6f5661642ddb0de2d974842316d1dac4abe348c5cd5d1b2f0873af08e2ac6f98"} Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.267185 4735 scope.go:117] "RemoveContainer" containerID="5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.267304 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8d99585df-2dfcf" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.292327 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7c85bb6db7-nz59r" podStartSLOduration=3.2923076 podStartE2EDuration="3.2923076s" podCreationTimestamp="2026-03-17 01:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:34.28577225 +0000 UTC m=+1259.918005218" watchObservedRunningTime="2026-03-17 01:30:34.2923076 +0000 UTC m=+1259.924540578" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.299010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86957cdc-x94sr" event={"ID":"c50099a5-67b0-4c0b-be11-146f30190beb","Type":"ContainerStarted","Data":"5afb2720042c68df7313b2a94e9681761d4cd7a4a87610bbe852676cf3df3fab"} Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.323044 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" event={"ID":"15817936-1648-4ce6-bfdd-c1cea98fe7e9","Type":"ContainerStarted","Data":"2fca2a2613f73abd56f97689015c3701ecb734859212f1d0dba3f16e1e6ff7da"} Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.336933 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-8d99585df-2dfcf"] Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.352761 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-8d99585df-2dfcf"] Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.368533 4735 scope.go:117] "RemoveContainer" containerID="7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.440512 4735 scope.go:117] "RemoveContainer" containerID="5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5" Mar 17 01:30:34 crc kubenswrapper[4735]: E0317 
01:30:34.440959 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5\": container with ID starting with 5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5 not found: ID does not exist" containerID="5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.441000 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5"} err="failed to get container status \"5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5\": rpc error: code = NotFound desc = could not find container \"5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5\": container with ID starting with 5d83a25b30d62038b2b8214c56ac4c7010b224c28cbbe7911f6f9b942dfc81d5 not found: ID does not exist" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.441024 4735 scope.go:117] "RemoveContainer" containerID="7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9" Mar 17 01:30:34 crc kubenswrapper[4735]: E0317 01:30:34.444630 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9\": container with ID starting with 7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9 not found: ID does not exist" containerID="7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9" Mar 17 01:30:34 crc kubenswrapper[4735]: I0317 01:30:34.444659 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9"} err="failed to get container status \"7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9\": rpc 
error: code = NotFound desc = could not find container \"7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9\": container with ID starting with 7bddecac6593e16cdb6a87cfc373de9c3f03183d070b2ef2a70e1440999c8cc9 not found: ID does not exist" Mar 17 01:30:35 crc kubenswrapper[4735]: I0317 01:30:35.083211 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" path="/var/lib/kubelet/pods/aedf685c-adde-4ccb-8a55-9c54e59ec1d2/volumes" Mar 17 01:30:35 crc kubenswrapper[4735]: I0317 01:30:35.136278 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7e91a303-0695-4862-ad03-3c9828b5a3a5" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.179:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:30:35 crc kubenswrapper[4735]: I0317 01:30:35.407347 4735 generic.go:334] "Generic (PLEG): container finished" podID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerID="a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c" exitCode=0 Mar 17 01:30:35 crc kubenswrapper[4735]: I0317 01:30:35.408960 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" event={"ID":"15817936-1648-4ce6-bfdd-c1cea98fe7e9","Type":"ContainerDied","Data":"a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c"} Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.544498 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-544585c649-tthkx"] Mar 17 01:30:37 crc kubenswrapper[4735]: E0317 01:30:37.545714 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.545729 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" 
containerName="barbican-worker" Mar 17 01:30:37 crc kubenswrapper[4735]: E0317 01:30:37.545744 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker-log" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.545751 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker-log" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.545960 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker-log" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.545975 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedf685c-adde-4ccb-8a55-9c54e59ec1d2" containerName="barbican-worker" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.546838 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.555325 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.555706 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.565023 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.565791 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-544585c649-tthkx"] Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.721636 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.732281 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944ea7da-36e9-4cb9-a65c-ff8730df5107-log-httpd\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.732535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944ea7da-36e9-4cb9-a65c-ff8730df5107-run-httpd\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.732664 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmgz\" (UniqueName: \"kubernetes.io/projected/944ea7da-36e9-4cb9-a65c-ff8730df5107-kube-api-access-ppmgz\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.732820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-combined-ca-bundle\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.732969 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-config-data\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.733272 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-public-tls-certs\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.733426 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/944ea7da-36e9-4cb9-a65c-ff8730df5107-etc-swift\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.733557 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-internal-tls-certs\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.834987 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-internal-tls-certs\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.835690 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944ea7da-36e9-4cb9-a65c-ff8730df5107-log-httpd\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.835082 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944ea7da-36e9-4cb9-a65c-ff8730df5107-log-httpd\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.836541 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944ea7da-36e9-4cb9-a65c-ff8730df5107-run-httpd\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.836916 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944ea7da-36e9-4cb9-a65c-ff8730df5107-run-httpd\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.836953 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmgz\" (UniqueName: \"kubernetes.io/projected/944ea7da-36e9-4cb9-a65c-ff8730df5107-kube-api-access-ppmgz\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.837035 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-combined-ca-bundle\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.837101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-config-data\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.837147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-public-tls-certs\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.837223 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/944ea7da-36e9-4cb9-a65c-ff8730df5107-etc-swift\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.843847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-combined-ca-bundle\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.844746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-internal-tls-certs\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.845664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-config-data\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.861166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ea7da-36e9-4cb9-a65c-ff8730df5107-public-tls-certs\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.862461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/944ea7da-36e9-4cb9-a65c-ff8730df5107-etc-swift\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.864552 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmgz\" (UniqueName: \"kubernetes.io/projected/944ea7da-36e9-4cb9-a65c-ff8730df5107-kube-api-access-ppmgz\") pod \"swift-proxy-544585c649-tthkx\" (UID: \"944ea7da-36e9-4cb9-a65c-ff8730df5107\") " pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:37 crc kubenswrapper[4735]: I0317 01:30:37.872165 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:38 crc kubenswrapper[4735]: I0317 01:30:38.701478 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 01:30:38 crc kubenswrapper[4735]: I0317 01:30:38.802432 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 17 01:30:39 crc kubenswrapper[4735]: I0317 01:30:39.406827 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 01:30:39 crc kubenswrapper[4735]: I0317 01:30:39.478216 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbbbfb4-v48lt"] Mar 17 01:30:39 crc kubenswrapper[4735]: I0317 01:30:39.478713 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fbbbfb4-v48lt" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-api" containerID="cri-o://6c22d0a33b7a7feb5c716d97ef6a490ca1f3f90d8d3e30b968cf89ada6c7f3db" gracePeriod=30 Mar 17 01:30:39 crc kubenswrapper[4735]: I0317 01:30:39.479235 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fbbbfb4-v48lt" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-httpd" containerID="cri-o://8438c54bbe429594476cc802adbf16afabcc27cd0ebfba431d5a1cb790fc7682" gracePeriod=30 Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.465923 4735 generic.go:334] "Generic (PLEG): container finished" podID="55f65e0a-d223-4c97-8911-d903950feb61" containerID="8438c54bbe429594476cc802adbf16afabcc27cd0ebfba431d5a1cb790fc7682" exitCode=0 Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.466127 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbbbfb4-v48lt" event={"ID":"55f65e0a-d223-4c97-8911-d903950feb61","Type":"ContainerDied","Data":"8438c54bbe429594476cc802adbf16afabcc27cd0ebfba431d5a1cb790fc7682"} Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.615212 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-55bb658757-9w8c7"] Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.616206 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.628657 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55bb658757-9w8c7"] Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.679063 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-778899646c-9ntf7"] Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.680256 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.703828 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-778899646c-9ntf7"] Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.709807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-combined-ca-bundle\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.710076 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5cz\" (UniqueName: \"kubernetes.io/projected/4da6ec53-9801-4b92-b096-a55e63155103-kube-api-access-mr5cz\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.710208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-config-data-custom\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.710333 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-config-data\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.743538 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-api-745b667557-vgprb"] Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.745344 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.760276 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-745b667557-vgprb"] Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812168 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-config-data\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-combined-ca-bundle\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812209 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8hq\" (UniqueName: \"kubernetes.io/projected/8b38fe4e-49b4-40e8-88bb-150a1b13d936-kube-api-access-sw8hq\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " 
pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812248 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-combined-ca-bundle\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5cz\" (UniqueName: \"kubernetes.io/projected/4da6ec53-9801-4b92-b096-a55e63155103-kube-api-access-mr5cz\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812344 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-config-data-custom\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.812386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data-custom\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.830470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-config-data-custom\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " 
pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.838955 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-config-data\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.859199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da6ec53-9801-4b92-b096-a55e63155103-combined-ca-bundle\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.897658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5cz\" (UniqueName: \"kubernetes.io/projected/4da6ec53-9801-4b92-b096-a55e63155103-kube-api-access-mr5cz\") pod \"heat-engine-55bb658757-9w8c7\" (UID: \"4da6ec53-9801-4b92-b096-a55e63155103\") " pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data-custom\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc 
kubenswrapper[4735]: I0317 01:30:40.916155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-combined-ca-bundle\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916176 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw8hq\" (UniqueName: \"kubernetes.io/projected/8b38fe4e-49b4-40e8-88bb-150a1b13d936-kube-api-access-sw8hq\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916239 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdr28\" (UniqueName: \"kubernetes.io/projected/3a289e78-a2ab-4533-ab26-d2b51d915eb2-kube-api-access-mdr28\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916258 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-combined-ca-bundle\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916278 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data-custom\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " 
pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.916318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.921216 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data-custom\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.922319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.937129 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:40 crc kubenswrapper[4735]: I0317 01:30:40.937562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-combined-ca-bundle\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.040648 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdr28\" (UniqueName: \"kubernetes.io/projected/3a289e78-a2ab-4533-ab26-d2b51d915eb2-kube-api-access-mdr28\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.040701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-combined-ca-bundle\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.040730 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data-custom\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.040763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " 
pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.050269 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw8hq\" (UniqueName: \"kubernetes.io/projected/8b38fe4e-49b4-40e8-88bb-150a1b13d936-kube-api-access-sw8hq\") pod \"heat-cfnapi-778899646c-9ntf7\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.057608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data-custom\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.058923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.072645 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-combined-ca-bundle\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.081802 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdr28\" (UniqueName: \"kubernetes.io/projected/3a289e78-a2ab-4533-ab26-d2b51d915eb2-kube-api-access-mdr28\") pod \"heat-api-745b667557-vgprb\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 
01:30:41.102059 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.315248 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.633336 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.633642 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-notification-agent" containerID="cri-o://08513105419d0794b474f0e1b3d37209bb8aa9050cb4b6196b78890f7b7104e4" gracePeriod=30 Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.633676 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="sg-core" containerID="cri-o://a8ad0e32b377d13a77f249cde37ff356a4b57b06deabf7dee6bd1ab518148e4a" gracePeriod=30 Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.633761 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="proxy-httpd" containerID="cri-o://a4a2469a21ca4659b0a2d58bffb535ca7d9475ef2bce7abb4791fdd7aae28a5f" gracePeriod=30 Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.633583 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-central-agent" containerID="cri-o://c103170577fe4a9863aef3582a529d0a8802811946102c994ebca36ea21e535d" gracePeriod=30 Mar 17 01:30:41 crc kubenswrapper[4735]: I0317 01:30:41.650593 4735 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": EOF" Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.380078 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.380513 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-log" containerID="cri-o://0add670582044a8cc9dbed9bb3d32df42740b89106b4b002c039ddf98a005495" gracePeriod=30 Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.380933 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-httpd" containerID="cri-o://6e666d4e4c78c41a6d71e9a0fdf011dba9f4b17db66cfde061aff5aaf6438c05" gracePeriod=30 Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.388361 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9292/healthcheck\": EOF" Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.388735 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.159:9292/healthcheck\": EOF" Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484410 4735 generic.go:334] "Generic (PLEG): container finished" podID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerID="a4a2469a21ca4659b0a2d58bffb535ca7d9475ef2bce7abb4791fdd7aae28a5f" exitCode=0 Mar 17 01:30:42 crc kubenswrapper[4735]: 
I0317 01:30:42.484438 4735 generic.go:334] "Generic (PLEG): container finished" podID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerID="a8ad0e32b377d13a77f249cde37ff356a4b57b06deabf7dee6bd1ab518148e4a" exitCode=2 Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484446 4735 generic.go:334] "Generic (PLEG): container finished" podID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerID="08513105419d0794b474f0e1b3d37209bb8aa9050cb4b6196b78890f7b7104e4" exitCode=0 Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484454 4735 generic.go:334] "Generic (PLEG): container finished" podID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerID="c103170577fe4a9863aef3582a529d0a8802811946102c994ebca36ea21e535d" exitCode=0 Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484481 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerDied","Data":"a4a2469a21ca4659b0a2d58bffb535ca7d9475ef2bce7abb4791fdd7aae28a5f"} Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484532 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerDied","Data":"a8ad0e32b377d13a77f249cde37ff356a4b57b06deabf7dee6bd1ab518148e4a"} Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484542 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerDied","Data":"08513105419d0794b474f0e1b3d37209bb8aa9050cb4b6196b78890f7b7104e4"} Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.484551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerDied","Data":"c103170577fe4a9863aef3582a529d0a8802811946102c994ebca36ea21e535d"} Mar 17 01:30:42 crc kubenswrapper[4735]: I0317 01:30:42.554489 4735 scope.go:117] 
"RemoveContainer" containerID="356d6b57d9e05fdbec20214f9cf284404c8599bde4ae456a5a47df63588eba71" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.273620 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-86957cdc-x94sr"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.303058 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6c568f7f98-2x6ww"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.304786 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.306997 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.323511 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.341917 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-67694b8cd8-nkbk6"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.350277 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c568f7f98-2x6ww"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.393368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqrt\" (UniqueName: \"kubernetes.io/projected/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-kube-api-access-4nqrt\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.393442 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-combined-ca-bundle\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: 
\"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.393467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-internal-tls-certs\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.393498 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-public-tls-certs\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.393534 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-config-data-custom\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.393590 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-config-data\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.400901 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7fb7d7d79f-sbp8m"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.402192 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.404529 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.405088 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.423543 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7fb7d7d79f-sbp8m"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.496698 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmxn\" (UniqueName: \"kubernetes.io/projected/7534832e-8997-4cf3-8f8a-c8f91debac15-kube-api-access-8kmxn\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.496745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqrt\" (UniqueName: \"kubernetes.io/projected/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-kube-api-access-4nqrt\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.496925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-combined-ca-bundle\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.496965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-internal-tls-certs\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.496995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-internal-tls-certs\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-config-data\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-public-tls-certs\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497256 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-config-data-custom\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-public-tls-certs\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497316 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-config-data-custom\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497358 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-combined-ca-bundle\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.497377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-config-data\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.503886 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-config-data-custom\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.504514 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-combined-ca-bundle\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.508144 4735 generic.go:334] "Generic (PLEG): container finished" podID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerID="0add670582044a8cc9dbed9bb3d32df42740b89106b4b002c039ddf98a005495" exitCode=143 Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.508184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"762ac106-44fe-4a09-9dcf-a55d7c4573fe","Type":"ContainerDied","Data":"0add670582044a8cc9dbed9bb3d32df42740b89106b4b002c039ddf98a005495"} Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.514068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-config-data\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.515547 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqrt\" (UniqueName: \"kubernetes.io/projected/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-kube-api-access-4nqrt\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.515658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-internal-tls-certs\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.530577 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a8c1-5baf-48c7-886f-8701fa5ce663-public-tls-certs\") pod \"heat-api-6c568f7f98-2x6ww\" (UID: \"0ac8a8c1-5baf-48c7-886f-8701fa5ce663\") " pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.599321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-internal-tls-certs\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.599599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-config-data\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.599642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-public-tls-certs\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.599663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-config-data-custom\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.599690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-combined-ca-bundle\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.599761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmxn\" (UniqueName: \"kubernetes.io/projected/7534832e-8997-4cf3-8f8a-c8f91debac15-kube-api-access-8kmxn\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.607004 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-combined-ca-bundle\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.607311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-public-tls-certs\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.607746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-internal-tls-certs\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.608481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-config-data\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.609911 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7534832e-8997-4cf3-8f8a-c8f91debac15-config-data-custom\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.626285 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.635917 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmxn\" (UniqueName: \"kubernetes.io/projected/7534832e-8997-4cf3-8f8a-c8f91debac15-kube-api-access-8kmxn\") pod \"heat-cfnapi-7fb7d7d79f-sbp8m\" (UID: \"7534832e-8997-4cf3-8f8a-c8f91debac15\") " pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.737779 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.791426 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.791677 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-log" containerID="cri-o://f91aa2371c330a548923615f99c1fd2995b9c30c9ba7438cd3426ffed2aadb00" gracePeriod=30 Mar 17 01:30:43 crc kubenswrapper[4735]: I0317 01:30:43.792083 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-httpd" containerID="cri-o://28057c7a8bb9f4b30d111669f97c3397827c24b0312f1d99a5cd90d8a0a1ce5f" gracePeriod=30 Mar 17 01:30:44 crc kubenswrapper[4735]: I0317 01:30:44.517777 4735 generic.go:334] "Generic (PLEG): container finished" podID="37dddec5-15b5-4e45-880b-10225e30412c" containerID="f91aa2371c330a548923615f99c1fd2995b9c30c9ba7438cd3426ffed2aadb00" exitCode=143 Mar 17 01:30:44 crc kubenswrapper[4735]: I0317 01:30:44.518057 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37dddec5-15b5-4e45-880b-10225e30412c","Type":"ContainerDied","Data":"f91aa2371c330a548923615f99c1fd2995b9c30c9ba7438cd3426ffed2aadb00"} Mar 17 01:30:44 crc kubenswrapper[4735]: I0317 01:30:44.523024 4735 generic.go:334] "Generic (PLEG): container finished" podID="55f65e0a-d223-4c97-8911-d903950feb61" containerID="6c22d0a33b7a7feb5c716d97ef6a490ca1f3f90d8d3e30b968cf89ada6c7f3db" exitCode=0 Mar 17 01:30:44 crc kubenswrapper[4735]: I0317 01:30:44.523068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbbbfb4-v48lt" 
event={"ID":"55f65e0a-d223-4c97-8911-d903950feb61","Type":"ContainerDied","Data":"6c22d0a33b7a7feb5c716d97ef6a490ca1f3f90d8d3e30b968cf89ada6c7f3db"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.593751 4735 generic.go:334] "Generic (PLEG): container finished" podID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerID="6e666d4e4c78c41a6d71e9a0fdf011dba9f4b17db66cfde061aff5aaf6438c05" exitCode=0 Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.596699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"762ac106-44fe-4a09-9dcf-a55d7c4573fe","Type":"ContainerDied","Data":"6e666d4e4c78c41a6d71e9a0fdf011dba9f4b17db66cfde061aff5aaf6438c05"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.600962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" event={"ID":"15817936-1648-4ce6-bfdd-c1cea98fe7e9","Type":"ContainerStarted","Data":"512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.601940 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.617706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d","Type":"ContainerDied","Data":"088b5f635c67627ff639f2af384840c0601ebf41dc2489f5de9b290f2bd9eb7e"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.617746 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="088b5f635c67627ff639f2af384840c0601ebf41dc2489f5de9b290f2bd9eb7e" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.619762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86957cdc-x94sr" 
event={"ID":"c50099a5-67b0-4c0b-be11-146f30190beb","Type":"ContainerStarted","Data":"d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.619912 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-86957cdc-x94sr" podUID="c50099a5-67b0-4c0b-be11-146f30190beb" containerName="heat-api" containerID="cri-o://d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23" gracePeriod=60 Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.620135 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.629336 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" podStartSLOduration=15.629313213 podStartE2EDuration="15.629313213s" podCreationTimestamp="2026-03-17 01:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:47.620380784 +0000 UTC m=+1273.252613762" watchObservedRunningTime="2026-03-17 01:30:47.629313213 +0000 UTC m=+1273.261546191" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.636600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbbbfb4-v48lt" event={"ID":"55f65e0a-d223-4c97-8911-d903950feb61","Type":"ContainerDied","Data":"43b75518a88aaf9aa00570d062d964a8c64fdbf4e4739393442ad397e82ae65c"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.636630 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b75518a88aaf9aa00570d062d964a8c64fdbf4e4739393442ad397e82ae65c" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.644740 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-86957cdc-x94sr" podStartSLOduration=2.751782462 
podStartE2EDuration="15.64471842s" podCreationTimestamp="2026-03-17 01:30:32 +0000 UTC" firstStartedPulling="2026-03-17 01:30:33.766374549 +0000 UTC m=+1259.398607527" lastFinishedPulling="2026-03-17 01:30:46.659310507 +0000 UTC m=+1272.291543485" observedRunningTime="2026-03-17 01:30:47.63570464 +0000 UTC m=+1273.267937618" watchObservedRunningTime="2026-03-17 01:30:47.64471842 +0000 UTC m=+1273.276951398" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.659518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c","Type":"ContainerStarted","Data":"6abaa994f9b7d99041bde99a31846fd6324b14920ef4b1758073c82d7b1733f7"} Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.678937 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.346827832 podStartE2EDuration="24.678918438s" podCreationTimestamp="2026-03-17 01:30:23 +0000 UTC" firstStartedPulling="2026-03-17 01:30:24.292709605 +0000 UTC m=+1249.924942583" lastFinishedPulling="2026-03-17 01:30:46.624800211 +0000 UTC m=+1272.257033189" observedRunningTime="2026-03-17 01:30:47.676698883 +0000 UTC m=+1273.308931861" watchObservedRunningTime="2026-03-17 01:30:47.678918438 +0000 UTC m=+1273.311151416" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.737966 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7fb7d7d79f-sbp8m"] Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.760143 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55bb658757-9w8c7"] Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.787899 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.858714 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.941924 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-sg-core-conf-yaml\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.941981 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b74t5\" (UniqueName: \"kubernetes.io/projected/55f65e0a-d223-4c97-8911-d903950feb61-kube-api-access-b74t5\") pod \"55f65e0a-d223-4c97-8911-d903950feb61\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942018 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-config-data\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-config\") pod \"55f65e0a-d223-4c97-8911-d903950feb61\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-ovndb-tls-certs\") pod \"55f65e0a-d223-4c97-8911-d903950feb61\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942170 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-combined-ca-bundle\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-run-httpd\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-log-httpd\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942346 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck6ww\" (UniqueName: \"kubernetes.io/projected/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-kube-api-access-ck6ww\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942365 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-scripts\") pod \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\" (UID: \"1bd85594-1041-42d3-b9f6-0b4ce4b46e3d\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-combined-ca-bundle\") pod \"55f65e0a-d223-4c97-8911-d903950feb61\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.942419 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-httpd-config\") pod \"55f65e0a-d223-4c97-8911-d903950feb61\" (UID: \"55f65e0a-d223-4c97-8911-d903950feb61\") " Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.949883 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "55f65e0a-d223-4c97-8911-d903950feb61" (UID: "55f65e0a-d223-4c97-8911-d903950feb61"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.954913 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.955029 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.956974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-scripts" (OuterVolumeSpecName: "scripts") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.973990 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f65e0a-d223-4c97-8911-d903950feb61-kube-api-access-b74t5" (OuterVolumeSpecName: "kube-api-access-b74t5") pod "55f65e0a-d223-4c97-8911-d903950feb61" (UID: "55f65e0a-d223-4c97-8911-d903950feb61"). InnerVolumeSpecName "kube-api-access-b74t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.982466 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-kube-api-access-ck6ww" (OuterVolumeSpecName: "kube-api-access-ck6ww") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "kube-api-access-ck6ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:47 crc kubenswrapper[4735]: I0317 01:30:47.986498 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-778899646c-9ntf7"] Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.035415 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-745b667557-vgprb"] Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.044575 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.044602 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.044611 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck6ww\" (UniqueName: 
\"kubernetes.io/projected/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-kube-api-access-ck6ww\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.044620 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.044628 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.044637 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b74t5\" (UniqueName: \"kubernetes.io/projected/55f65e0a-d223-4c97-8911-d903950feb61-kube-api-access-b74t5\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.141725 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.145478 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-config" (OuterVolumeSpecName: "config") pod "55f65e0a-d223-4c97-8911-d903950feb61" (UID: "55f65e0a-d223-4c97-8911-d903950feb61"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.148088 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.148112 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.152574 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c568f7f98-2x6ww"] Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.658964 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-674b457696-6r8nd" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.696503 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-544585c649-tthkx"] Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.698747 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" podUID="3aaa377c-bdde-46eb-89e4-361d0fa8cb36" containerName="heat-cfnapi" containerID="cri-o://739ceda1b92ef8d6499d25eadea73f9b1146c804e11422ba306cf12b1f6e4d79" gracePeriod=60 Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.698917 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" event={"ID":"3aaa377c-bdde-46eb-89e4-361d0fa8cb36","Type":"ContainerStarted","Data":"739ceda1b92ef8d6499d25eadea73f9b1146c804e11422ba306cf12b1f6e4d79"} Mar 17 01:30:48 crc 
kubenswrapper[4735]: I0317 01:30:48.700051 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.709718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55bb658757-9w8c7" event={"ID":"4da6ec53-9801-4b92-b096-a55e63155103","Type":"ContainerStarted","Data":"3a7dc01e58cb7337e3e2c36566d7fdab06c5fb94ac26049f65ed086254ee8e18"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.709758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55bb658757-9w8c7" event={"ID":"4da6ec53-9801-4b92-b096-a55e63155103","Type":"ContainerStarted","Data":"f66d8cae6aeb6da08359969f2f5d6077c561a7e8da087b2ded5d11c9ec86b105"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.709885 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.712691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-745b667557-vgprb" event={"ID":"3a289e78-a2ab-4533-ab26-d2b51d915eb2","Type":"ContainerStarted","Data":"bb32de3403ddde0878a63cc909c651964d21b82a5c0b47806bd09b1fc33da447"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.714439 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" podStartSLOduration=3.378408439 podStartE2EDuration="16.714430279s" podCreationTimestamp="2026-03-17 01:30:32 +0000 UTC" firstStartedPulling="2026-03-17 01:30:33.430988275 +0000 UTC m=+1259.063221243" lastFinishedPulling="2026-03-17 01:30:46.767010115 +0000 UTC m=+1272.399243083" observedRunningTime="2026-03-17 01:30:48.71284613 +0000 UTC m=+1274.345079108" watchObservedRunningTime="2026-03-17 01:30:48.714430279 +0000 UTC m=+1274.346663257" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.736591 4735 generic.go:334] "Generic 
(PLEG): container finished" podID="37dddec5-15b5-4e45-880b-10225e30412c" containerID="28057c7a8bb9f4b30d111669f97c3397827c24b0312f1d99a5cd90d8a0a1ce5f" exitCode=0 Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.736847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37dddec5-15b5-4e45-880b-10225e30412c","Type":"ContainerDied","Data":"28057c7a8bb9f4b30d111669f97c3397827c24b0312f1d99a5cd90d8a0a1ce5f"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.736941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37dddec5-15b5-4e45-880b-10225e30412c","Type":"ContainerDied","Data":"7119e80535f25cd52c1996b4ec6b2329d20e0b561b9c5b091b6ebf2f58f891ea"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.737054 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7119e80535f25cd52c1996b4ec6b2329d20e0b561b9c5b091b6ebf2f58f891ea" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.738818 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" event={"ID":"7534832e-8997-4cf3-8f8a-c8f91debac15","Type":"ContainerStarted","Data":"1ae190bf232eeb63b737496b7e02a31b7c41f82a7f51e27e960bf65d3e2e418b"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.740678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"762ac106-44fe-4a09-9dcf-a55d7c4573fe","Type":"ContainerDied","Data":"21ad39983cc00ae0c75f1ff290b9a3e8a0558db2efcba90250d71eedb4ce14de"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.740705 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ad39983cc00ae0c75f1ff290b9a3e8a0558db2efcba90250d71eedb4ce14de" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.741497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-778899646c-9ntf7" event={"ID":"8b38fe4e-49b4-40e8-88bb-150a1b13d936","Type":"ContainerStarted","Data":"5ef36ba3c9f126694f9c7e3a5f1d9ef84b2a6c3ab2856ca8346c44147347bb6a"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.742264 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.742620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c568f7f98-2x6ww" event={"ID":"0ac8a8c1-5baf-48c7-886f-8701fa5ce663","Type":"ContainerStarted","Data":"e5bdfd682869f4f562a8717078f23f905c579a2a83677a19a6dedada55e1828f"} Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.742797 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.754400 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-55bb658757-9w8c7" podStartSLOduration=8.754378378 podStartE2EDuration="8.754378378s" podCreationTimestamp="2026-03-17 01:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:48.752497672 +0000 UTC m=+1274.384730660" watchObservedRunningTime="2026-03-17 01:30:48.754378378 +0000 UTC m=+1274.386611356" Mar 17 01:30:48 crc kubenswrapper[4735]: W0317 01:30:48.759928 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod944ea7da_36e9_4cb9_a65c_ff8730df5107.slice/crio-2e2390156d45cfde6522c11ef94e773942d679a0afaa01416deae21730066583 WatchSource:0}: Error finding container 2e2390156d45cfde6522c11ef94e773942d679a0afaa01416deae21730066583: Status 404 returned error can't find the container with id 2e2390156d45cfde6522c11ef94e773942d679a0afaa01416deae21730066583 Mar 17 01:30:48 crc 
kubenswrapper[4735]: I0317 01:30:48.985728 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55f65e0a-d223-4c97-8911-d903950feb61" (UID: "55f65e0a-d223-4c97-8911-d903950feb61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:48 crc kubenswrapper[4735]: I0317 01:30:48.990403 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "55f65e0a-d223-4c97-8911-d903950feb61" (UID: "55f65e0a-d223-4c97-8911-d903950feb61"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.005152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.075361 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.075584 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.075641 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f65e0a-d223-4c97-8911-d903950feb61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.083236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-config-data" (OuterVolumeSpecName: "config-data") pod "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" (UID: "1bd85594-1041-42d3-b9f6-0b4ce4b46e3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.177328 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.283264 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lfv7\" (UniqueName: \"kubernetes.io/projected/37dddec5-15b5-4e45-880b-10225e30412c-kube-api-access-5lfv7\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393457 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393510 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-internal-tls-certs\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-scripts\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393615 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-combined-ca-bundle\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393642 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-httpd-run\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393689 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-logs\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.393748 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-config-data\") pod \"37dddec5-15b5-4e45-880b-10225e30412c\" (UID: \"37dddec5-15b5-4e45-880b-10225e30412c\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.394970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.398061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-logs" (OuterVolumeSpecName: "logs") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.418274 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37dddec5-15b5-4e45-880b-10225e30412c-kube-api-access-5lfv7" (OuterVolumeSpecName: "kube-api-access-5lfv7") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "kube-api-access-5lfv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.420208 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-scripts" (OuterVolumeSpecName: "scripts") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.470984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.475146 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.495317 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lfv7\" (UniqueName: \"kubernetes.io/projected/37dddec5-15b5-4e45-880b-10225e30412c-kube-api-access-5lfv7\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.495353 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.495363 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.495371 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.495380 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.495387 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dddec5-15b5-4e45-880b-10225e30412c-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.516202 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.538149 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.550592 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-config-data" (OuterVolumeSpecName: "config-data") pod "37dddec5-15b5-4e45-880b-10225e30412c" (UID: "37dddec5-15b5-4e45-880b-10225e30412c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.597294 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.597322 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.597333 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dddec5-15b5-4e45-880b-10225e30412c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.664415 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.764028 4735 generic.go:334] "Generic (PLEG): container finished" podID="3aaa377c-bdde-46eb-89e4-361d0fa8cb36" containerID="739ceda1b92ef8d6499d25eadea73f9b1146c804e11422ba306cf12b1f6e4d79" exitCode=0 Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.764102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" event={"ID":"3aaa377c-bdde-46eb-89e4-361d0fa8cb36","Type":"ContainerDied","Data":"739ceda1b92ef8d6499d25eadea73f9b1146c804e11422ba306cf12b1f6e4d79"} Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.766901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-544585c649-tthkx" event={"ID":"944ea7da-36e9-4cb9-a65c-ff8730df5107","Type":"ContainerStarted","Data":"2e2390156d45cfde6522c11ef94e773942d679a0afaa01416deae21730066583"} Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.769622 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.772984 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.778743 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.779454 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" event={"ID":"7534832e-8997-4cf3-8f8a-c8f91debac15","Type":"ContainerStarted","Data":"403c52bca0b389003add54071a4b17f092bc1f1b16bceb42c025d1cc9294d1de"} Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.779476 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.783810 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794175 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794513 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-log" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794539 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-log" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794549 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-notification-agent" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794557 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-notification-agent" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794865 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794877 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794892 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794898 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794907 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-api" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794913 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-api" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794925 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="sg-core" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794930 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="sg-core" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794941 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-central-agent" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794946 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-central-agent" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794956 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="proxy-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794969 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="proxy-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794980 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-log" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.794985 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-log" Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.794999 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795005 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795163 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-log" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795181 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-central-agent" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795190 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-api" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795199 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="proxy-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795209 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dddec5-15b5-4e45-880b-10225e30412c" containerName="glance-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795218 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="ceilometer-notification-agent" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795227 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-log" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795238 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" containerName="sg-core" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795246 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" containerName="glance-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.795254 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f65e0a-d223-4c97-8911-d903950feb61" containerName="neutron-httpd" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.799643 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.801591 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-public-tls-certs\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.801630 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-config-data\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.801699 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-combined-ca-bundle\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.801770 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6knf\" (UniqueName: \"kubernetes.io/projected/762ac106-44fe-4a09-9dcf-a55d7c4573fe-kube-api-access-v6knf\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.801841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.801994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-httpd-run\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.802025 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-logs\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.802312 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.802630 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.803713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-scripts\") pod \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\" (UID: \"762ac106-44fe-4a09-9dcf-a55d7c4573fe\") " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.824605 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.840567 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.858641 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762ac106-44fe-4a09-9dcf-a55d7c4573fe-kube-api-access-v6knf" (OuterVolumeSpecName: "kube-api-access-v6knf") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "kube-api-access-v6knf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.859119 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-scripts" (OuterVolumeSpecName: "scripts") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.859218 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-logs" (OuterVolumeSpecName: "logs") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.859421 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.878745 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" podStartSLOduration=6.878727095 podStartE2EDuration="6.878727095s" podCreationTimestamp="2026-03-17 01:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:49.813872537 +0000 UTC m=+1275.446105515" watchObservedRunningTime="2026-03-17 01:30:49.878727095 +0000 UTC m=+1275.510960073" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.906725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-scripts\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.906791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.907185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-log-httpd\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.907659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-config-data\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.907830 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.907836 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-run-httpd\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908241 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vbz\" (UniqueName: \"kubernetes.io/projected/d8d56137-e804-41a6-add3-015e21d4e1ac-kube-api-access-k7vbz\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908321 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908338 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908347 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762ac106-44fe-4a09-9dcf-a55d7c4573fe-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908356 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.908365 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6knf\" (UniqueName: \"kubernetes.io/projected/762ac106-44fe-4a09-9dcf-a55d7c4573fe-kube-api-access-v6knf\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.928944 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.939885 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.998248 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:30:49 crc kubenswrapper[4735]: E0317 01:30:49.998698 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aaa377c-bdde-46eb-89e4-361d0fa8cb36" containerName="heat-cfnapi" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.998715 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3aaa377c-bdde-46eb-89e4-361d0fa8cb36" containerName="heat-cfnapi" Mar 17 01:30:49 crc kubenswrapper[4735]: I0317 01:30:49.998971 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aaa377c-bdde-46eb-89e4-361d0fa8cb36" containerName="heat-cfnapi" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.000051 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.005522 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.005691 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.009999 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjgxq\" (UniqueName: \"kubernetes.io/projected/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-kube-api-access-vjgxq\") pod \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010094 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data-custom\") pod \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010160 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data\") pod \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010415 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-combined-ca-bundle\") pod \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\" (UID: \"3aaa377c-bdde-46eb-89e4-361d0fa8cb36\") " Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010827 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vbz\" (UniqueName: \"kubernetes.io/projected/d8d56137-e804-41a6-add3-015e21d4e1ac-kube-api-access-k7vbz\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-scripts\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.010982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-log-httpd\") pod \"ceilometer-0\" (UID: 
\"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.011016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-config-data\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.011046 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-run-httpd\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.011501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-run-httpd\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.012658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-log-httpd\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.034828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-config-data\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.039166 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 
01:30:50.166303 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.167562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.167786 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vbz\" (UniqueName: \"kubernetes.io/projected/d8d56137-e804-41a6-add3-015e21d4e1ac-kube-api-access-k7vbz\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.168095 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-scripts\") pod \"ceilometer-0\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.174804 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-kube-api-access-vjgxq" (OuterVolumeSpecName: "kube-api-access-vjgxq") pod "3aaa377c-bdde-46eb-89e4-361d0fa8cb36" (UID: "3aaa377c-bdde-46eb-89e4-361d0fa8cb36"). InnerVolumeSpecName "kube-api-access-vjgxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.196103 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3aaa377c-bdde-46eb-89e4-361d0fa8cb36" (UID: "3aaa377c-bdde-46eb-89e4-361d0fa8cb36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.255519 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfnmx\" (UniqueName: \"kubernetes.io/projected/e8f3c02b-c96f-469b-86df-21a5342bca54-kube-api-access-gfnmx\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.256071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.256282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3c02b-c96f-469b-86df-21a5342bca54-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.256559 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.256708 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.256917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.258962 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.259444 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3c02b-c96f-469b-86df-21a5342bca54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.262514 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjgxq\" (UniqueName: \"kubernetes.io/projected/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-kube-api-access-vjgxq\") on node \"crc\" 
DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.262646 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.288116 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.351998 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.359356 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-config-data" (OuterVolumeSpecName: "config-data") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364248 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfnmx\" (UniqueName: \"kubernetes.io/projected/e8f3c02b-c96f-469b-86df-21a5342bca54-kube-api-access-gfnmx\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364299 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364339 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3c02b-c96f-469b-86df-21a5342bca54-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364417 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 
17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364477 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364498 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3c02b-c96f-469b-86df-21a5342bca54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364566 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364580 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364593 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.364934 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3c02b-c96f-469b-86df-21a5342bca54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.368038 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.370191 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3c02b-c96f-469b-86df-21a5342bca54-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.373680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.388372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.388803 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.393368 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3c02b-c96f-469b-86df-21a5342bca54-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.399489 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfnmx\" (UniqueName: \"kubernetes.io/projected/e8f3c02b-c96f-469b-86df-21a5342bca54-kube-api-access-gfnmx\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.406138 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aaa377c-bdde-46eb-89e4-361d0fa8cb36" (UID: "3aaa377c-bdde-46eb-89e4-361d0fa8cb36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.441282 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.461176 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8f3c02b-c96f-469b-86df-21a5342bca54\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.471216 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.520993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "762ac106-44fe-4a09-9dcf-a55d7c4573fe" (UID: "762ac106-44fe-4a09-9dcf-a55d7c4573fe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.565577 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data" (OuterVolumeSpecName: "config-data") pod "3aaa377c-bdde-46eb-89e4-361d0fa8cb36" (UID: "3aaa377c-bdde-46eb-89e4-361d0fa8cb36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.573033 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/762ac106-44fe-4a09-9dcf-a55d7c4573fe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.573682 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aaa377c-bdde-46eb-89e4-361d0fa8cb36-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.634034 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.835128 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c568f7f98-2x6ww" event={"ID":"0ac8a8c1-5baf-48c7-886f-8701fa5ce663","Type":"ContainerStarted","Data":"78ee8a33f5b71fe5bf4061df634c3df7e563d7455011ae1387aa8ad0cb795257"} Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.837270 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.838096 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.840014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" event={"ID":"3aaa377c-bdde-46eb-89e4-361d0fa8cb36","Type":"ContainerDied","Data":"f37b4c3fd9103c506c21dc17e3337cddb782b98deeeb1f718d0c2ccf78282fa0"} Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.840046 4735 scope.go:117] "RemoveContainer" containerID="739ceda1b92ef8d6499d25eadea73f9b1146c804e11422ba306cf12b1f6e4d79" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.840146 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67694b8cd8-nkbk6" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.866309 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.873116 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-745b667557-vgprb" event={"ID":"3a289e78-a2ab-4533-ab26-d2b51d915eb2","Type":"ContainerStarted","Data":"a5496706bfadbb3fb343ab868bea847c1588a70d927e01c85901a03f6103a8ad"} Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.873319 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.883517 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-544585c649-tthkx" event={"ID":"944ea7da-36e9-4cb9-a65c-ff8730df5107","Type":"ContainerStarted","Data":"8b6a15a84a60fb9e006b78f1f2ccb1745a1dd37954caa9738fdba61e8c7e56f6"} Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.892145 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.893914 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.899204 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.899424 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.906255 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-778899646c-9ntf7" event={"ID":"8b38fe4e-49b4-40e8-88bb-150a1b13d936","Type":"ContainerStarted","Data":"72cbe0872f77e5a69e341654927cd93fac5c4e933ddd7caf0613a95e66dfd03d"} Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.906291 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.914907 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.916164 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6c568f7f98-2x6ww" podStartSLOduration=7.916142853 podStartE2EDuration="7.916142853s" podCreationTimestamp="2026-03-17 01:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:50.872898893 +0000 UTC m=+1276.505131871" watchObservedRunningTime="2026-03-17 01:30:50.916142853 +0000 UTC m=+1276.548375831" Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.968186 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-67694b8cd8-nkbk6"] Mar 17 01:30:50 crc kubenswrapper[4735]: I0317 01:30:50.981099 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-67694b8cd8-nkbk6"] Mar 17 01:30:50 crc 
kubenswrapper[4735]: I0317 01:30:50.986167 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-745b667557-vgprb" podStartSLOduration=10.986156017 podStartE2EDuration="10.986156017s" podCreationTimestamp="2026-03-17 01:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:50.91766269 +0000 UTC m=+1276.549895668" watchObservedRunningTime="2026-03-17 01:30:50.986156017 +0000 UTC m=+1276.618388985" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.020927 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-778899646c-9ntf7" podStartSLOduration=11.020911019 podStartE2EDuration="11.020911019s" podCreationTimestamp="2026-03-17 01:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:50.977472975 +0000 UTC m=+1276.609705953" watchObservedRunningTime="2026-03-17 01:30:51.020911019 +0000 UTC m=+1276.653143997" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.091414 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd85594-1041-42d3-b9f6-0b4ce4b46e3d" path="/var/lib/kubelet/pods/1bd85594-1041-42d3-b9f6-0b4ce4b46e3d/volumes" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.092176 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37dddec5-15b5-4e45-880b-10225e30412c" path="/var/lib/kubelet/pods/37dddec5-15b5-4e45-880b-10225e30412c/volumes" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.093370 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aaa377c-bdde-46eb-89e4-361d0fa8cb36" path="/var/lib/kubelet/pods/3aaa377c-bdde-46eb-89e4-361d0fa8cb36/volumes" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.093795 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.093931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.093965 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762ac106-44fe-4a09-9dcf-a55d7c4573fe" path="/var/lib/kubelet/pods/762ac106-44fe-4a09-9dcf-a55d7c4573fe/volumes" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.093980 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.093998 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqxf\" (UniqueName: \"kubernetes.io/projected/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-kube-api-access-kkqxf\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.094032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.094084 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.094132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-logs\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.094175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.140688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.195773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196035 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196193 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196344 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkqxf\" (UniqueName: \"kubernetes.io/projected/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-kube-api-access-kkqxf\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196650 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-logs\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.196761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.197949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.198162 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.198805 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-logs\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.202475 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.206049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.208834 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.222820 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.233495 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqxf\" (UniqueName: \"kubernetes.io/projected/fc51e2e8-eeb0-49e2-8893-f2d1aca3be95-kube-api-access-kkqxf\") pod \"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.285903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-external-api-0\" (UID: \"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95\") " pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.395848 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:30:51 crc kubenswrapper[4735]: W0317 01:30:51.404251 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8f3c02b_c96f_469b_86df_21a5342bca54.slice/crio-6dec45426d494aa4624ad1c7b44e812cc5576e8df53e4c9885e13fd1038ead21 WatchSource:0}: Error finding container 6dec45426d494aa4624ad1c7b44e812cc5576e8df53e4c9885e13fd1038ead21: Status 404 returned error can't find the container with id 6dec45426d494aa4624ad1c7b44e812cc5576e8df53e4c9885e13fd1038ead21 Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.550184 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.933128 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerID="a5496706bfadbb3fb343ab868bea847c1588a70d927e01c85901a03f6103a8ad" exitCode=1 Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.933227 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-745b667557-vgprb" event={"ID":"3a289e78-a2ab-4533-ab26-d2b51d915eb2","Type":"ContainerDied","Data":"a5496706bfadbb3fb343ab868bea847c1588a70d927e01c85901a03f6103a8ad"} Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.933774 4735 scope.go:117] "RemoveContainer" containerID="a5496706bfadbb3fb343ab868bea847c1588a70d927e01c85901a03f6103a8ad" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.946524 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-544585c649-tthkx" 
event={"ID":"944ea7da-36e9-4cb9-a65c-ff8730df5107","Type":"ContainerStarted","Data":"8f89c6511a276c9c0d5eecd513787e2b84202fe05061d97901ab90f98e086449"} Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.947099 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.947128 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.994159 4735 generic.go:334] "Generic (PLEG): container finished" podID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerID="72cbe0872f77e5a69e341654927cd93fac5c4e933ddd7caf0613a95e66dfd03d" exitCode=1 Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.994269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-778899646c-9ntf7" event={"ID":"8b38fe4e-49b4-40e8-88bb-150a1b13d936","Type":"ContainerDied","Data":"72cbe0872f77e5a69e341654927cd93fac5c4e933ddd7caf0613a95e66dfd03d"} Mar 17 01:30:51 crc kubenswrapper[4735]: I0317 01:30:51.995200 4735 scope.go:117] "RemoveContainer" containerID="72cbe0872f77e5a69e341654927cd93fac5c4e933ddd7caf0613a95e66dfd03d" Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.081738 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-544585c649-tthkx" podStartSLOduration=15.081720909 podStartE2EDuration="15.081720909s" podCreationTimestamp="2026-03-17 01:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:52.004079278 +0000 UTC m=+1277.636312256" watchObservedRunningTime="2026-03-17 01:30:52.081720909 +0000 UTC m=+1277.713953877" Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.094081 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerStarted","Data":"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3"} Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.094126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerStarted","Data":"79be2a7555dd215e2d45ec1bbafe88a1817685e5327172f174dd45da6d1c738d"} Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.110200 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3c02b-c96f-469b-86df-21a5342bca54","Type":"ContainerStarted","Data":"6dec45426d494aa4624ad1c7b44e812cc5576e8df53e4c9885e13fd1038ead21"} Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.240475 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.385702 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.574011 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.659309 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bd446449-s7v5p"] Mar 17 01:30:52 crc kubenswrapper[4735]: I0317 01:30:52.659532 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerName="dnsmasq-dns" containerID="cri-o://6a6aaca10444de81ed64104bf8c16d872e36eb0e6fbb0a48917a05df9227ab2b" gracePeriod=10 Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.219571 4735 generic.go:334] "Generic (PLEG): container finished" podID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" 
containerID="52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622" exitCode=1 Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.219758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-778899646c-9ntf7" event={"ID":"8b38fe4e-49b4-40e8-88bb-150a1b13d936","Type":"ContainerDied","Data":"52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622"} Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.219915 4735 scope.go:117] "RemoveContainer" containerID="72cbe0872f77e5a69e341654927cd93fac5c4e933ddd7caf0613a95e66dfd03d" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.220311 4735 scope.go:117] "RemoveContainer" containerID="52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622" Mar 17 01:30:53 crc kubenswrapper[4735]: E0317 01:30:53.220530 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-778899646c-9ntf7_openstack(8b38fe4e-49b4-40e8-88bb-150a1b13d936)\"" pod="openstack/heat-cfnapi-778899646c-9ntf7" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.228666 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerStarted","Data":"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7"} Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.248574 4735 generic.go:334] "Generic (PLEG): container finished" podID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerID="6a6aaca10444de81ed64104bf8c16d872e36eb0e6fbb0a48917a05df9227ab2b" exitCode=0 Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.248656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" 
event={"ID":"dee3a5db-edbc-4df1-8087-2c6630561bb2","Type":"ContainerDied","Data":"6a6aaca10444de81ed64104bf8c16d872e36eb0e6fbb0a48917a05df9227ab2b"} Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.260173 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95","Type":"ContainerStarted","Data":"8ff9c3e9af5ca8de528a260c1e75b76b4db7770df31255752252d3f213b0e43d"} Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.261605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3c02b-c96f-469b-86df-21a5342bca54","Type":"ContainerStarted","Data":"166db6c4b5194cf6af452474fe125438feffb64bc6f8c96084c765571b14b148"} Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.263640 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerID="b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325" exitCode=1 Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.264997 4735 scope.go:117] "RemoveContainer" containerID="b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325" Mar 17 01:30:53 crc kubenswrapper[4735]: E0317 01:30:53.265570 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-745b667557-vgprb_openstack(3a289e78-a2ab-4533-ab26-d2b51d915eb2)\"" pod="openstack/heat-api-745b667557-vgprb" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.265772 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-745b667557-vgprb" event={"ID":"3a289e78-a2ab-4533-ab26-d2b51d915eb2","Type":"ContainerDied","Data":"b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325"} Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 
01:30:53.295265 4735 scope.go:117] "RemoveContainer" containerID="a5496706bfadbb3fb343ab868bea847c1588a70d927e01c85901a03f6103a8ad" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.547300 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.583167 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-sb\") pod \"dee3a5db-edbc-4df1-8087-2c6630561bb2\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.583250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-config\") pod \"dee3a5db-edbc-4df1-8087-2c6630561bb2\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.583273 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-nb\") pod \"dee3a5db-edbc-4df1-8087-2c6630561bb2\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.583301 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfm7x\" (UniqueName: \"kubernetes.io/projected/dee3a5db-edbc-4df1-8087-2c6630561bb2-kube-api-access-bfm7x\") pod \"dee3a5db-edbc-4df1-8087-2c6630561bb2\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.583369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-svc\") pod 
\"dee3a5db-edbc-4df1-8087-2c6630561bb2\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.583391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-swift-storage-0\") pod \"dee3a5db-edbc-4df1-8087-2c6630561bb2\" (UID: \"dee3a5db-edbc-4df1-8087-2c6630561bb2\") " Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.620092 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee3a5db-edbc-4df1-8087-2c6630561bb2-kube-api-access-bfm7x" (OuterVolumeSpecName: "kube-api-access-bfm7x") pod "dee3a5db-edbc-4df1-8087-2c6630561bb2" (UID: "dee3a5db-edbc-4df1-8087-2c6630561bb2"). InnerVolumeSpecName "kube-api-access-bfm7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.665190 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dee3a5db-edbc-4df1-8087-2c6630561bb2" (UID: "dee3a5db-edbc-4df1-8087-2c6630561bb2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.685362 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.685396 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfm7x\" (UniqueName: \"kubernetes.io/projected/dee3a5db-edbc-4df1-8087-2c6630561bb2-kube-api-access-bfm7x\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.695599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dee3a5db-edbc-4df1-8087-2c6630561bb2" (UID: "dee3a5db-edbc-4df1-8087-2c6630561bb2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.700566 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dee3a5db-edbc-4df1-8087-2c6630561bb2" (UID: "dee3a5db-edbc-4df1-8087-2c6630561bb2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.727178 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-config" (OuterVolumeSpecName: "config") pod "dee3a5db-edbc-4df1-8087-2c6630561bb2" (UID: "dee3a5db-edbc-4df1-8087-2c6630561bb2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.741191 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dee3a5db-edbc-4df1-8087-2c6630561bb2" (UID: "dee3a5db-edbc-4df1-8087-2c6630561bb2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.786941 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.786973 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.786983 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:53 crc kubenswrapper[4735]: I0317 01:30:53.786995 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dee3a5db-edbc-4df1-8087-2c6630561bb2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.278813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerStarted","Data":"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c"} Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.281896 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" 
event={"ID":"dee3a5db-edbc-4df1-8087-2c6630561bb2","Type":"ContainerDied","Data":"b37b408d512ebb85e11244d858e460e27546a0b4543577f1425b52ff33bb9544"} Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.282068 4735 scope.go:117] "RemoveContainer" containerID="6a6aaca10444de81ed64104bf8c16d872e36eb0e6fbb0a48917a05df9227ab2b" Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.281909 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd446449-s7v5p" Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.294947 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95","Type":"ContainerStarted","Data":"d6c21a5602df57ccf50d066cea077a7655003076f8f5e425955802d7fb5c555a"} Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.307490 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3c02b-c96f-469b-86df-21a5342bca54","Type":"ContainerStarted","Data":"0674b24bf9b046295340f01562373d8bc30837baea8362d91f68b4266bae6f57"} Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.325598 4735 scope.go:117] "RemoveContainer" containerID="b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325" Mar 17 01:30:54 crc kubenswrapper[4735]: E0317 01:30:54.325801 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-745b667557-vgprb_openstack(3a289e78-a2ab-4533-ab26-d2b51d915eb2)\"" pod="openstack/heat-api-745b667557-vgprb" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.327758 4735 scope.go:117] "RemoveContainer" containerID="52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622" Mar 17 01:30:54 crc kubenswrapper[4735]: E0317 01:30:54.330324 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-778899646c-9ntf7_openstack(8b38fe4e-49b4-40e8-88bb-150a1b13d936)\"" pod="openstack/heat-cfnapi-778899646c-9ntf7" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.358276 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bd446449-s7v5p"] Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.374939 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bd446449-s7v5p"] Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.424589 4735 scope.go:117] "RemoveContainer" containerID="89ee5dabb834341edc8d7ed850ecb2e4e4d3b107e11515ad9885ab76894572e8" Mar 17 01:30:54 crc kubenswrapper[4735]: I0317 01:30:54.444270 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.444251091 podStartE2EDuration="5.444251091s" podCreationTimestamp="2026-03-17 01:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:54.39032858 +0000 UTC m=+1280.022561558" watchObservedRunningTime="2026-03-17 01:30:54.444251091 +0000 UTC m=+1280.076484069" Mar 17 01:30:55 crc kubenswrapper[4735]: I0317 01:30:55.089648 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" path="/var/lib/kubelet/pods/dee3a5db-edbc-4df1-8087-2c6630561bb2/volumes" Mar 17 01:30:55 crc kubenswrapper[4735]: I0317 01:30:55.114484 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:55 crc kubenswrapper[4735]: I0317 01:30:55.352029 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"fc51e2e8-eeb0-49e2-8893-f2d1aca3be95","Type":"ContainerStarted","Data":"9eceb379f5050328086b0c1bfadfa7e089d43010af4d807788e0c1f73f2e797f"} Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.103255 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.103935 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.104582 4735 scope.go:117] "RemoveContainer" containerID="b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325" Mar 17 01:30:56 crc kubenswrapper[4735]: E0317 01:30:56.104829 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-745b667557-vgprb_openstack(3a289e78-a2ab-4533-ab26-d2b51d915eb2)\"" pod="openstack/heat-api-745b667557-vgprb" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.110263 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.128417 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.128399889 podStartE2EDuration="6.128399889s" podCreationTimestamp="2026-03-17 01:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:30:55.383979376 +0000 UTC m=+1281.016212354" watchObservedRunningTime="2026-03-17 01:30:56.128399889 +0000 UTC m=+1281.760632867" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.317977 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.318889 4735 scope.go:117] "RemoveContainer" containerID="52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622" Mar 17 01:30:56 crc kubenswrapper[4735]: E0317 01:30:56.319112 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-778899646c-9ntf7_openstack(8b38fe4e-49b4-40e8-88bb-150a1b13d936)\"" pod="openstack/heat-cfnapi-778899646c-9ntf7" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.319403 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.376904 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerStarted","Data":"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad"} Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.377304 4735 scope.go:117] "RemoveContainer" containerID="52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622" Mar 17 01:30:56 crc kubenswrapper[4735]: E0317 01:30:56.377514 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-778899646c-9ntf7_openstack(8b38fe4e-49b4-40e8-88bb-150a1b13d936)\"" pod="openstack/heat-cfnapi-778899646c-9ntf7" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.378186 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.378507 4735 
scope.go:117] "RemoveContainer" containerID="b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325" Mar 17 01:30:56 crc kubenswrapper[4735]: E0317 01:30:56.378693 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-745b667557-vgprb_openstack(3a289e78-a2ab-4533-ab26-d2b51d915eb2)\"" pod="openstack/heat-api-745b667557-vgprb" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.419739 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.185199513 podStartE2EDuration="7.419720623s" podCreationTimestamp="2026-03-17 01:30:49 +0000 UTC" firstStartedPulling="2026-03-17 01:30:51.16390028 +0000 UTC m=+1276.796133258" lastFinishedPulling="2026-03-17 01:30:55.39842139 +0000 UTC m=+1281.030654368" observedRunningTime="2026-03-17 01:30:56.413391068 +0000 UTC m=+1282.045624046" watchObservedRunningTime="2026-03-17 01:30:56.419720623 +0000 UTC m=+1282.051953601" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.756320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7fb7d7d79f-sbp8m" Mar 17 01:30:56 crc kubenswrapper[4735]: I0317 01:30:56.822371 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-778899646c-9ntf7"] Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.170082 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268261 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7jpx\" (UniqueName: \"kubernetes.io/projected/32c72925-26da-41c7-8279-8bc23ef68b62-kube-api-access-s7jpx\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-config-data\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268372 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-scripts\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268572 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-tls-certs\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c72925-26da-41c7-8279-8bc23ef68b62-logs\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268711 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-secret-key\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.268733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-combined-ca-bundle\") pod \"32c72925-26da-41c7-8279-8bc23ef68b62\" (UID: \"32c72925-26da-41c7-8279-8bc23ef68b62\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.269390 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c72925-26da-41c7-8279-8bc23ef68b62-logs" (OuterVolumeSpecName: "logs") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.281034 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.297059 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c72925-26da-41c7-8279-8bc23ef68b62-kube-api-access-s7jpx" (OuterVolumeSpecName: "kube-api-access-s7jpx") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). InnerVolumeSpecName "kube-api-access-s7jpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.307381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-scripts" (OuterVolumeSpecName: "scripts") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.341842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-config-data" (OuterVolumeSpecName: "config-data") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.346878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.374248 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c72925-26da-41c7-8279-8bc23ef68b62-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.380624 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.381716 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.381802 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7jpx\" (UniqueName: \"kubernetes.io/projected/32c72925-26da-41c7-8279-8bc23ef68b62-kube-api-access-s7jpx\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.382152 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.382317 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32c72925-26da-41c7-8279-8bc23ef68b62-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.379164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "32c72925-26da-41c7-8279-8bc23ef68b62" (UID: "32c72925-26da-41c7-8279-8bc23ef68b62"). 
InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.407262 4735 generic.go:334] "Generic (PLEG): container finished" podID="32c72925-26da-41c7-8279-8bc23ef68b62" containerID="88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9" exitCode=137 Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.408195 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-674b457696-6r8nd" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.409964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b457696-6r8nd" event={"ID":"32c72925-26da-41c7-8279-8bc23ef68b62","Type":"ContainerDied","Data":"88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9"} Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.409991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b457696-6r8nd" event={"ID":"32c72925-26da-41c7-8279-8bc23ef68b62","Type":"ContainerDied","Data":"95a691c501f48dc3248fa26337e8f1aad8806a5f25e5ac42eab5117080d76469"} Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.410006 4735 scope.go:117] "RemoveContainer" containerID="67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.483504 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c72925-26da-41c7-8279-8bc23ef68b62-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.490792 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-674b457696-6r8nd"] Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.501393 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-674b457696-6r8nd"] Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.634509 4735 
scope.go:117] "RemoveContainer" containerID="88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.675300 4735 scope.go:117] "RemoveContainer" containerID="67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5" Mar 17 01:30:57 crc kubenswrapper[4735]: E0317 01:30:57.677217 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5\": container with ID starting with 67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5 not found: ID does not exist" containerID="67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.677253 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5"} err="failed to get container status \"67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5\": rpc error: code = NotFound desc = could not find container \"67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5\": container with ID starting with 67458265b57c3c25051b4559ee6fa17964acbf5e9be0df3017215f1ec6237de5 not found: ID does not exist" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.677278 4735 scope.go:117] "RemoveContainer" containerID="88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9" Mar 17 01:30:57 crc kubenswrapper[4735]: E0317 01:30:57.681417 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9\": container with ID starting with 88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9 not found: ID does not exist" containerID="88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9" Mar 17 
01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.681446 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9"} err="failed to get container status \"88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9\": rpc error: code = NotFound desc = could not find container \"88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9\": container with ID starting with 88c66f74e9e123b41f4448f4489b15ebd1c16088a0b4564a719ba3ea480cb4f9 not found: ID does not exist" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.734339 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.884969 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-544585c649-tthkx" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.891218 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-combined-ca-bundle\") pod \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.891333 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data\") pod \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.891446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data-custom\") pod \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\" (UID: 
\"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.891480 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw8hq\" (UniqueName: \"kubernetes.io/projected/8b38fe4e-49b4-40e8-88bb-150a1b13d936-kube-api-access-sw8hq\") pod \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\" (UID: \"8b38fe4e-49b4-40e8-88bb-150a1b13d936\") " Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.900543 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b38fe4e-49b4-40e8-88bb-150a1b13d936-kube-api-access-sw8hq" (OuterVolumeSpecName: "kube-api-access-sw8hq") pod "8b38fe4e-49b4-40e8-88bb-150a1b13d936" (UID: "8b38fe4e-49b4-40e8-88bb-150a1b13d936"). InnerVolumeSpecName "kube-api-access-sw8hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.903117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b38fe4e-49b4-40e8-88bb-150a1b13d936" (UID: "8b38fe4e-49b4-40e8-88bb-150a1b13d936"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.968785 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data" (OuterVolumeSpecName: "config-data") pod "8b38fe4e-49b4-40e8-88bb-150a1b13d936" (UID: "8b38fe4e-49b4-40e8-88bb-150a1b13d936"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.976050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b38fe4e-49b4-40e8-88bb-150a1b13d936" (UID: "8b38fe4e-49b4-40e8-88bb-150a1b13d936"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.994332 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.994458 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.994646 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b38fe4e-49b4-40e8-88bb-150a1b13d936-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:57 crc kubenswrapper[4735]: I0317 01:30:57.994676 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw8hq\" (UniqueName: \"kubernetes.io/projected/8b38fe4e-49b4-40e8-88bb-150a1b13d936-kube-api-access-sw8hq\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:58 crc kubenswrapper[4735]: I0317 01:30:58.416306 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-778899646c-9ntf7" event={"ID":"8b38fe4e-49b4-40e8-88bb-150a1b13d936","Type":"ContainerDied","Data":"5ef36ba3c9f126694f9c7e3a5f1d9ef84b2a6c3ab2856ca8346c44147347bb6a"} Mar 17 01:30:58 crc kubenswrapper[4735]: I0317 01:30:58.416350 4735 scope.go:117] "RemoveContainer" 
containerID="52b587c5bf9a9e650ce5cdca18dbb8ff90c2503ce49afccca7588628551b6622" Mar 17 01:30:58 crc kubenswrapper[4735]: I0317 01:30:58.416403 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-778899646c-9ntf7" Mar 17 01:30:58 crc kubenswrapper[4735]: I0317 01:30:58.510980 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-778899646c-9ntf7"] Mar 17 01:30:58 crc kubenswrapper[4735]: I0317 01:30:58.520054 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-778899646c-9ntf7"] Mar 17 01:30:59 crc kubenswrapper[4735]: I0317 01:30:59.083445 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" path="/var/lib/kubelet/pods/32c72925-26da-41c7-8279-8bc23ef68b62/volumes" Mar 17 01:30:59 crc kubenswrapper[4735]: I0317 01:30:59.084513 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" path="/var/lib/kubelet/pods/8b38fe4e-49b4-40e8-88bb-150a1b13d936/volumes" Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.635170 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.635217 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.675663 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6c568f7f98-2x6ww" Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.684675 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.701438 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.740612 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-745b667557-vgprb"] Mar 17 01:31:00 crc kubenswrapper[4735]: I0317 01:31:00.976283 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-55bb658757-9w8c7" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.022407 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7c85bb6db7-nz59r"] Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.022615 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7c85bb6db7-nz59r" podUID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" containerName="heat-engine" containerID="cri-o://9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" gracePeriod=60 Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.244983 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.353936 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-combined-ca-bundle\") pod \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.354010 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data\") pod \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.354051 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data-custom\") pod \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.354086 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdr28\" (UniqueName: \"kubernetes.io/projected/3a289e78-a2ab-4533-ab26-d2b51d915eb2-kube-api-access-mdr28\") pod \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\" (UID: \"3a289e78-a2ab-4533-ab26-d2b51d915eb2\") " Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.359961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a289e78-a2ab-4533-ab26-d2b51d915eb2-kube-api-access-mdr28" (OuterVolumeSpecName: "kube-api-access-mdr28") pod "3a289e78-a2ab-4533-ab26-d2b51d915eb2" (UID: "3a289e78-a2ab-4533-ab26-d2b51d915eb2"). InnerVolumeSpecName "kube-api-access-mdr28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.363057 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a289e78-a2ab-4533-ab26-d2b51d915eb2" (UID: "3a289e78-a2ab-4533-ab26-d2b51d915eb2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.398439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a289e78-a2ab-4533-ab26-d2b51d915eb2" (UID: "3a289e78-a2ab-4533-ab26-d2b51d915eb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.410753 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data" (OuterVolumeSpecName: "config-data") pod "3a289e78-a2ab-4533-ab26-d2b51d915eb2" (UID: "3a289e78-a2ab-4533-ab26-d2b51d915eb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.449908 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-745b667557-vgprb" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.451281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-745b667557-vgprb" event={"ID":"3a289e78-a2ab-4533-ab26-d2b51d915eb2","Type":"ContainerDied","Data":"bb32de3403ddde0878a63cc909c651964d21b82a5c0b47806bd09b1fc33da447"} Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.451370 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.451390 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.451408 4735 scope.go:117] "RemoveContainer" containerID="b8996e592107492c6080c732c8a5efd406d82102267ed63eccff880f67d2f325" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.456340 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdr28\" (UniqueName: \"kubernetes.io/projected/3a289e78-a2ab-4533-ab26-d2b51d915eb2-kube-api-access-mdr28\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.456379 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.456393 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.456406 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a289e78-a2ab-4533-ab26-d2b51d915eb2-config-data-custom\") on node \"crc\" DevicePath \"\"" 
Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.512824 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-745b667557-vgprb"] Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.528299 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-745b667557-vgprb"] Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.551087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.551142 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.601300 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 01:31:01 crc kubenswrapper[4735]: I0317 01:31:01.602198 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 01:31:02 crc kubenswrapper[4735]: E0317 01:31:02.293297 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 01:31:02 crc kubenswrapper[4735]: E0317 01:31:02.298668 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 01:31:02 crc kubenswrapper[4735]: E0317 01:31:02.300632 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 01:31:02 crc kubenswrapper[4735]: E0317 01:31:02.300696 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7c85bb6db7-nz59r" podUID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" containerName="heat-engine" Mar 17 01:31:02 crc kubenswrapper[4735]: I0317 01:31:02.457316 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 01:31:02 crc kubenswrapper[4735]: I0317 01:31:02.457347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 01:31:03 crc kubenswrapper[4735]: I0317 01:31:03.083377 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" path="/var/lib/kubelet/pods/3a289e78-a2ab-4533-ab26-d2b51d915eb2/volumes" Mar 17 01:31:03 crc kubenswrapper[4735]: I0317 01:31:03.462259 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:31:03 crc kubenswrapper[4735]: I0317 01:31:03.462284 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:31:05 crc kubenswrapper[4735]: I0317 01:31:05.453392 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 01:31:05 crc kubenswrapper[4735]: I0317 01:31:05.453713 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:31:05 crc kubenswrapper[4735]: I0317 01:31:05.458298 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Mar 17 01:31:05 crc kubenswrapper[4735]: I0317 01:31:05.592280 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:05 crc kubenswrapper[4735]: I0317 01:31:05.592378 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:31:05 crc kubenswrapper[4735]: I0317 01:31:05.984316 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 01:31:07 crc kubenswrapper[4735]: I0317 01:31:07.470202 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:07 crc kubenswrapper[4735]: I0317 01:31:07.470693 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-central-agent" containerID="cri-o://fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" gracePeriod=30 Mar 17 01:31:07 crc kubenswrapper[4735]: I0317 01:31:07.472472 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="sg-core" containerID="cri-o://cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" gracePeriod=30 Mar 17 01:31:07 crc kubenswrapper[4735]: I0317 01:31:07.472549 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="proxy-httpd" containerID="cri-o://573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" gracePeriod=30 Mar 17 01:31:07 crc kubenswrapper[4735]: I0317 01:31:07.472598 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-notification-agent" 
containerID="cri-o://b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" gracePeriod=30 Mar 17 01:31:07 crc kubenswrapper[4735]: I0317 01:31:07.491397 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.494763 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499206 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" exitCode=0 Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499462 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" exitCode=2 Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499473 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" exitCode=0 Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499482 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" exitCode=0 Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerDied","Data":"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad"} Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499516 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerDied","Data":"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c"} Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerDied","Data":"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7"} Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerDied","Data":"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3"} Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8d56137-e804-41a6-add3-015e21d4e1ac","Type":"ContainerDied","Data":"79be2a7555dd215e2d45ec1bbafe88a1817685e5327172f174dd45da6d1c738d"} Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499565 4735 scope.go:117] "RemoveContainer" containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.499266 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.529218 4735 scope.go:117] "RemoveContainer" containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.574362 4735 scope.go:117] "RemoveContainer" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.604764 4735 scope.go:117] "RemoveContainer" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.621613 4735 scope.go:117] "RemoveContainer" containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" Mar 17 01:31:08 crc kubenswrapper[4735]: E0317 01:31:08.622117 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": container with ID starting with 573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad not found: ID does not exist" containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.622157 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad"} err="failed to get container status \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": rpc error: code = NotFound desc = could not find container \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": container with ID starting with 573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.622182 4735 scope.go:117] "RemoveContainer" 
containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" Mar 17 01:31:08 crc kubenswrapper[4735]: E0317 01:31:08.622501 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": container with ID starting with cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c not found: ID does not exist" containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.622589 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c"} err="failed to get container status \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": rpc error: code = NotFound desc = could not find container \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": container with ID starting with cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.622666 4735 scope.go:117] "RemoveContainer" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" Mar 17 01:31:08 crc kubenswrapper[4735]: E0317 01:31:08.623109 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": container with ID starting with b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7 not found: ID does not exist" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.623140 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7"} err="failed to get container status \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": rpc error: code = NotFound desc = could not find container \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": container with ID starting with b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.623166 4735 scope.go:117] "RemoveContainer" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" Mar 17 01:31:08 crc kubenswrapper[4735]: E0317 01:31:08.623357 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": container with ID starting with fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3 not found: ID does not exist" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.623383 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3"} err="failed to get container status \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": rpc error: code = NotFound desc = could not find container \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": container with ID starting with fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.623398 4735 scope.go:117] "RemoveContainer" containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.623596 4735 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad"} err="failed to get container status \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": rpc error: code = NotFound desc = could not find container \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": container with ID starting with 573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.623617 4735 scope.go:117] "RemoveContainer" containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.627515 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c"} err="failed to get container status \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": rpc error: code = NotFound desc = could not find container \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": container with ID starting with cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.627565 4735 scope.go:117] "RemoveContainer" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628001 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7"} err="failed to get container status \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": rpc error: code = NotFound desc = could not find container \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": container with ID starting with b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7 not 
found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628023 4735 scope.go:117] "RemoveContainer" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628263 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3"} err="failed to get container status \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": rpc error: code = NotFound desc = could not find container \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": container with ID starting with fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628285 4735 scope.go:117] "RemoveContainer" containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628577 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad"} err="failed to get container status \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": rpc error: code = NotFound desc = could not find container \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": container with ID starting with 573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628599 4735 scope.go:117] "RemoveContainer" containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628792 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c"} err="failed to get 
container status \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": rpc error: code = NotFound desc = could not find container \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": container with ID starting with cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.628812 4735 scope.go:117] "RemoveContainer" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.629100 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7"} err="failed to get container status \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": rpc error: code = NotFound desc = could not find container \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": container with ID starting with b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.629119 4735 scope.go:117] "RemoveContainer" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.629346 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3"} err="failed to get container status \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": rpc error: code = NotFound desc = could not find container \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": container with ID starting with fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.629367 4735 scope.go:117] "RemoveContainer" 
containerID="573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.630378 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad"} err="failed to get container status \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": rpc error: code = NotFound desc = could not find container \"573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad\": container with ID starting with 573b240a8da69b79f7830a5ad934258a476e2946fa0047906321e2de03cd9cad not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.630463 4735 scope.go:117] "RemoveContainer" containerID="cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.631038 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c"} err="failed to get container status \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": rpc error: code = NotFound desc = could not find container \"cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c\": container with ID starting with cde5f638eb25e40d77a35638153510c0046837f9dc61ba38df650f8375ce090c not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.631060 4735 scope.go:117] "RemoveContainer" containerID="b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.631240 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7"} err="failed to get container status \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": rpc error: code = NotFound desc = could 
not find container \"b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7\": container with ID starting with b0077d22402b83eba81c64d6ecdf933d3d4aaa8a8fb7541e64a6bca67bfab4f7 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.631261 4735 scope.go:117] "RemoveContainer" containerID="fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.634061 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3"} err="failed to get container status \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": rpc error: code = NotFound desc = could not find container \"fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3\": container with ID starting with fc9bf053fe1c3e4e4248a5d8518588f1e8e0ceb129df89236a54190e2ab20ad3 not found: ID does not exist" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697707 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-combined-ca-bundle\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697773 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-config-data\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697806 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-sg-core-conf-yaml\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" 
(UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-log-httpd\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7vbz\" (UniqueName: \"kubernetes.io/projected/d8d56137-e804-41a6-add3-015e21d4e1ac-kube-api-access-k7vbz\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697960 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-scripts\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.697986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-run-httpd\") pod \"d8d56137-e804-41a6-add3-015e21d4e1ac\" (UID: \"d8d56137-e804-41a6-add3-015e21d4e1ac\") " Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.698830 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.698976 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.724997 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d56137-e804-41a6-add3-015e21d4e1ac-kube-api-access-k7vbz" (OuterVolumeSpecName: "kube-api-access-k7vbz") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "kube-api-access-k7vbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.725348 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-scripts" (OuterVolumeSpecName: "scripts") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.819233 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.819265 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7vbz\" (UniqueName: \"kubernetes.io/projected/d8d56137-e804-41a6-add3-015e21d4e1ac-kube-api-access-k7vbz\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.819275 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.819283 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8d56137-e804-41a6-add3-015e21d4e1ac-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.844387 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.867067 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.911096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-config-data" (OuterVolumeSpecName: "config-data") pod "d8d56137-e804-41a6-add3-015e21d4e1ac" (UID: "d8d56137-e804-41a6-add3-015e21d4e1ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.923951 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.923986 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:08 crc kubenswrapper[4735]: I0317 01:31:08.923997 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8d56137-e804-41a6-add3-015e21d4e1ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.063718 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.134868 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mvwt\" (UniqueName: \"kubernetes.io/projected/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-kube-api-access-8mvwt\") pod \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.152780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-combined-ca-bundle\") pod \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.154020 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data-custom\") pod \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.154741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data\") pod \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\" (UID: \"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25\") " Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.197574 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-kube-api-access-8mvwt" (OuterVolumeSpecName: "kube-api-access-8mvwt") pod "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" (UID: "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25"). InnerVolumeSpecName "kube-api-access-8mvwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.213013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" (UID: "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.250239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" (UID: "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.250304 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.250338 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256021 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256382 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256399 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256415 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerName="heat-api" Mar 17 01:31:09 crc 
kubenswrapper[4735]: I0317 01:31:09.256422 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerName="heat-api" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256432 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-notification-agent" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256438 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-notification-agent" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256447 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerName="heat-api" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256453 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerName="heat-api" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256461 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="sg-core" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256466 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="sg-core" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256475 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerName="heat-cfnapi" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256480 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerName="heat-cfnapi" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256492 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerName="heat-cfnapi" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 
01:31:09.256497 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerName="heat-cfnapi" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256508 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-central-agent" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256513 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-central-agent" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256524 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon-log" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256529 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon-log" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256537 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" containerName="heat-engine" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256542 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" containerName="heat-engine" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256549 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerName="init" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256555 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerName="init" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256566 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerName="dnsmasq-dns" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256572 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerName="dnsmasq-dns" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.256585 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="proxy-httpd" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256592 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="proxy-httpd" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256932 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="sg-core" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256948 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerName="heat-api" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256959 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon-log" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256968 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" containerName="heat-engine" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256981 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="proxy-httpd" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.256989 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c72925-26da-41c7-8279-8bc23ef68b62" containerName="horizon" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.257029 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerName="heat-cfnapi" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.257040 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3a289e78-a2ab-4533-ab26-d2b51d915eb2" containerName="heat-api" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.257050 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b38fe4e-49b4-40e8-88bb-150a1b13d936" containerName="heat-cfnapi" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.257061 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-notification-agent" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.257068 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee3a5db-edbc-4df1-8087-2c6630561bb2" containerName="dnsmasq-dns" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.257098 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" containerName="ceilometer-central-agent" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.259173 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.263038 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.263255 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.281734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.286635 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.286664 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.286673 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mvwt\" (UniqueName: \"kubernetes.io/projected/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-kube-api-access-8mvwt\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.303029 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data" (OuterVolumeSpecName: "config-data") pod "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" (UID: "c950c4e3-7e39-47dd-b15f-0a9d20ca7f25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.388834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-scripts\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389151 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389229 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-config-data\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389328 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hjv7\" (UniqueName: \"kubernetes.io/projected/634c3d44-5160-4d84-9864-4cb87d7c6124-kube-api-access-2hjv7\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389401 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389475 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-run-httpd\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-log-httpd\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.389680 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491407 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-config-data\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 
01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hjv7\" (UniqueName: \"kubernetes.io/projected/634c3d44-5160-4d84-9864-4cb87d7c6124-kube-api-access-2hjv7\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491691 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-run-httpd\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-log-httpd\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.491901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-scripts\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.492648 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-run-httpd\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.493587 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-log-httpd\") pod \"ceilometer-0\" 
(UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.495537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-scripts\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.497738 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.498342 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.500181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-config-data\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.509350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hjv7\" (UniqueName: \"kubernetes.io/projected/634c3d44-5160-4d84-9864-4cb87d7c6124-kube-api-access-2hjv7\") pod \"ceilometer-0\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " pod="openstack/ceilometer-0" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.514297 4735 generic.go:334] "Generic (PLEG): container finished" podID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" 
containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" exitCode=0 Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.514351 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7c85bb6db7-nz59r" event={"ID":"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25","Type":"ContainerDied","Data":"9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be"} Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.514386 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7c85bb6db7-nz59r" event={"ID":"c950c4e3-7e39-47dd-b15f-0a9d20ca7f25","Type":"ContainerDied","Data":"4891417cfbd7e37b5ef44bc89c69c9a75bd835f06d210a3d974f5414c82292b8"} Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.514403 4735 scope.go:117] "RemoveContainer" containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.514525 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7c85bb6db7-nz59r" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.545018 4735 scope.go:117] "RemoveContainer" containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" Mar 17 01:31:09 crc kubenswrapper[4735]: E0317 01:31:09.545394 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be\": container with ID starting with 9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be not found: ID does not exist" containerID="9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.545418 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be"} err="failed to get container status \"9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be\": rpc error: code = NotFound desc = could not find container \"9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be\": container with ID starting with 9ee842d0e9e7e2dfb6edc5ac65f18aef796bdf6b8630b1935c7cde4f8a6a76be not found: ID does not exist" Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.576657 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7c85bb6db7-nz59r"] Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.583474 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7c85bb6db7-nz59r"] Mar 17 01:31:09 crc kubenswrapper[4735]: I0317 01:31:09.585954 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:10 crc kubenswrapper[4735]: I0317 01:31:10.108141 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:10 crc kubenswrapper[4735]: I0317 01:31:10.390962 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:10 crc kubenswrapper[4735]: I0317 01:31:10.524828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerStarted","Data":"3abd3417ee50deb5f6a859fea1324dffbe63a87f18de5d5ce6dcba5877dc0f31"} Mar 17 01:31:11 crc kubenswrapper[4735]: I0317 01:31:11.082699 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c950c4e3-7e39-47dd-b15f-0a9d20ca7f25" path="/var/lib/kubelet/pods/c950c4e3-7e39-47dd-b15f-0a9d20ca7f25/volumes" Mar 17 01:31:11 crc kubenswrapper[4735]: I0317 01:31:11.083539 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8d56137-e804-41a6-add3-015e21d4e1ac" path="/var/lib/kubelet/pods/d8d56137-e804-41a6-add3-015e21d4e1ac/volumes" Mar 17 01:31:11 crc kubenswrapper[4735]: I0317 01:31:11.535325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerStarted","Data":"cfff3c9f4322e15dd50215598d9783a2f754e93596d7f70f3b4b7cf722ee6405"} Mar 17 01:31:11 crc kubenswrapper[4735]: I0317 01:31:11.535367 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerStarted","Data":"afdc10adc79daad84ffc11a3d09df274a43ac7ab029a859c895f7d3c57d1673b"} Mar 17 01:31:12 crc kubenswrapper[4735]: I0317 01:31:12.554383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerStarted","Data":"99556957eb65b3c4b24064be3ee4c17bf17883a75397c1c8f3cad2e615049c45"} Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.569849 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerStarted","Data":"371df8c7b7e289944321bc529b897a02d51a31c7991b5c9367890baa3b185225"} Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.570306 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.570072 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-notification-agent" containerID="cri-o://afdc10adc79daad84ffc11a3d09df274a43ac7ab029a859c895f7d3c57d1673b" gracePeriod=30 Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.570093 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="sg-core" containerID="cri-o://99556957eb65b3c4b24064be3ee4c17bf17883a75397c1c8f3cad2e615049c45" gracePeriod=30 Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.570301 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-central-agent" containerID="cri-o://cfff3c9f4322e15dd50215598d9783a2f754e93596d7f70f3b4b7cf722ee6405" gracePeriod=30 Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.570081 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="proxy-httpd" containerID="cri-o://371df8c7b7e289944321bc529b897a02d51a31c7991b5c9367890baa3b185225" 
gracePeriod=30 Mar 17 01:31:14 crc kubenswrapper[4735]: I0317 01:31:14.596567 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.728538476 podStartE2EDuration="5.59654997s" podCreationTimestamp="2026-03-17 01:31:09 +0000 UTC" firstStartedPulling="2026-03-17 01:31:10.120465434 +0000 UTC m=+1295.752698402" lastFinishedPulling="2026-03-17 01:31:13.988476918 +0000 UTC m=+1299.620709896" observedRunningTime="2026-03-17 01:31:14.593997078 +0000 UTC m=+1300.226230056" watchObservedRunningTime="2026-03-17 01:31:14.59654997 +0000 UTC m=+1300.228782948" Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.593976 4735 generic.go:334] "Generic (PLEG): container finished" podID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerID="371df8c7b7e289944321bc529b897a02d51a31c7991b5c9367890baa3b185225" exitCode=0 Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.594198 4735 generic.go:334] "Generic (PLEG): container finished" podID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerID="99556957eb65b3c4b24064be3ee4c17bf17883a75397c1c8f3cad2e615049c45" exitCode=2 Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.594206 4735 generic.go:334] "Generic (PLEG): container finished" podID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerID="afdc10adc79daad84ffc11a3d09df274a43ac7ab029a859c895f7d3c57d1673b" exitCode=0 Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.594213 4735 generic.go:334] "Generic (PLEG): container finished" podID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerID="cfff3c9f4322e15dd50215598d9783a2f754e93596d7f70f3b4b7cf722ee6405" exitCode=0 Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.594232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerDied","Data":"371df8c7b7e289944321bc529b897a02d51a31c7991b5c9367890baa3b185225"} Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 
01:31:15.594256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerDied","Data":"99556957eb65b3c4b24064be3ee4c17bf17883a75397c1c8f3cad2e615049c45"} Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.594265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerDied","Data":"afdc10adc79daad84ffc11a3d09df274a43ac7ab029a859c895f7d3c57d1673b"} Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.594272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerDied","Data":"cfff3c9f4322e15dd50215598d9783a2f754e93596d7f70f3b4b7cf722ee6405"} Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.833625 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.916987 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-combined-ca-bundle\") pod \"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.917268 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-config-data\") pod \"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.917380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hjv7\" (UniqueName: \"kubernetes.io/projected/634c3d44-5160-4d84-9864-4cb87d7c6124-kube-api-access-2hjv7\") pod 
\"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.917563 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-scripts\") pod \"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.917660 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-sg-core-conf-yaml\") pod \"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.917750 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-log-httpd\") pod \"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.917833 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-run-httpd\") pod \"634c3d44-5160-4d84-9864-4cb87d7c6124\" (UID: \"634c3d44-5160-4d84-9864-4cb87d7c6124\") " Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.918565 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.921501 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.928009 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-scripts" (OuterVolumeSpecName: "scripts") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.928050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634c3d44-5160-4d84-9864-4cb87d7c6124-kube-api-access-2hjv7" (OuterVolumeSpecName: "kube-api-access-2hjv7") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "kube-api-access-2hjv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:15 crc kubenswrapper[4735]: I0317 01:31:15.952336 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.001040 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.019620 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.019777 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hjv7\" (UniqueName: \"kubernetes.io/projected/634c3d44-5160-4d84-9864-4cb87d7c6124-kube-api-access-2hjv7\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.019885 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.019973 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.020418 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.020507 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/634c3d44-5160-4d84-9864-4cb87d7c6124-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.026017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-config-data" (OuterVolumeSpecName: "config-data") pod "634c3d44-5160-4d84-9864-4cb87d7c6124" (UID: "634c3d44-5160-4d84-9864-4cb87d7c6124"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.124127 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c3d44-5160-4d84-9864-4cb87d7c6124-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.612946 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"634c3d44-5160-4d84-9864-4cb87d7c6124","Type":"ContainerDied","Data":"3abd3417ee50deb5f6a859fea1324dffbe63a87f18de5d5ce6dcba5877dc0f31"} Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.613588 4735 scope.go:117] "RemoveContainer" containerID="371df8c7b7e289944321bc529b897a02d51a31c7991b5c9367890baa3b185225" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.613792 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.650816 4735 scope.go:117] "RemoveContainer" containerID="99556957eb65b3c4b24064be3ee4c17bf17883a75397c1c8f3cad2e615049c45" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.686070 4735 scope.go:117] "RemoveContainer" containerID="afdc10adc79daad84ffc11a3d09df274a43ac7ab029a859c895f7d3c57d1673b" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.691452 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.704128 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.713326 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:16 crc kubenswrapper[4735]: E0317 01:31:16.713728 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="sg-core" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.713739 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="sg-core" Mar 17 01:31:16 crc kubenswrapper[4735]: E0317 01:31:16.713759 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-notification-agent" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.713765 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-notification-agent" Mar 17 01:31:16 crc kubenswrapper[4735]: E0317 01:31:16.713776 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="proxy-httpd" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.713782 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="proxy-httpd" Mar 17 01:31:16 crc kubenswrapper[4735]: E0317 01:31:16.713801 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-central-agent" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.713806 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-central-agent" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.714005 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="proxy-httpd" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.714020 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-notification-agent" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.714032 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="ceilometer-central-agent" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.714048 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" containerName="sg-core" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.715606 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.719777 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.719995 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.723950 4735 scope.go:117] "RemoveContainer" containerID="cfff3c9f4322e15dd50215598d9783a2f754e93596d7f70f3b4b7cf722ee6405" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.727036 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.837679 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzpq4\" (UniqueName: \"kubernetes.io/projected/20dc1c3e-b968-4c74-982e-bccffcdfa318-kube-api-access-vzpq4\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.837723 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.837867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-config-data\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.837961 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-log-httpd\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.837993 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-scripts\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.838207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-run-httpd\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.838277 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-config-data\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " 
pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-log-httpd\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939721 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-scripts\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939786 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-run-httpd\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939813 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.939844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzpq4\" (UniqueName: \"kubernetes.io/projected/20dc1c3e-b968-4c74-982e-bccffcdfa318-kube-api-access-vzpq4\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.941167 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-run-httpd\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.943113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-log-httpd\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.947726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.948145 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-config-data\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.954877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.959310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-scripts\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:16 crc kubenswrapper[4735]: I0317 01:31:16.960137 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vzpq4\" (UniqueName: \"kubernetes.io/projected/20dc1c3e-b968-4c74-982e-bccffcdfa318-kube-api-access-vzpq4\") pod \"ceilometer-0\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " pod="openstack/ceilometer-0" Mar 17 01:31:17 crc kubenswrapper[4735]: I0317 01:31:17.041142 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:17 crc kubenswrapper[4735]: I0317 01:31:17.090552 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634c3d44-5160-4d84-9864-4cb87d7c6124" path="/var/lib/kubelet/pods/634c3d44-5160-4d84-9864-4cb87d7c6124/volumes" Mar 17 01:31:17 crc kubenswrapper[4735]: I0317 01:31:17.475539 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:17 crc kubenswrapper[4735]: I0317 01:31:17.623494 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerStarted","Data":"9c7083d73c0f48a2d41e2a3c46fb75e1925a1f71ae67105831e19d0ce4611723"} Mar 17 01:31:17 crc kubenswrapper[4735]: I0317 01:31:17.951937 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.320388 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gxtdn"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.321823 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.328896 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gxtdn"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.366451 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmgr\" (UniqueName: \"kubernetes.io/projected/02edf717-dfa9-4226-b695-f438427bc8a4-kube-api-access-hfmgr\") pod \"nova-api-db-create-gxtdn\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.366912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02edf717-dfa9-4226-b695-f438427bc8a4-operator-scripts\") pod \"nova-api-db-create-gxtdn\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.416928 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4cg78"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.418005 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.426337 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4cg78"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.449106 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4d4c-account-create-update-sc92d"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.451595 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.453814 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.468554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c423ac16-ac60-4ad3-9a8e-a1e5be701162-operator-scripts\") pod \"nova-cell0-db-create-4cg78\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.468594 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02edf717-dfa9-4226-b695-f438427bc8a4-operator-scripts\") pod \"nova-api-db-create-gxtdn\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.468618 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfmgr\" (UniqueName: \"kubernetes.io/projected/02edf717-dfa9-4226-b695-f438427bc8a4-kube-api-access-hfmgr\") pod \"nova-api-db-create-gxtdn\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.468668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mktq\" (UniqueName: \"kubernetes.io/projected/c423ac16-ac60-4ad3-9a8e-a1e5be701162-kube-api-access-5mktq\") pod \"nova-cell0-db-create-4cg78\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.471875 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02edf717-dfa9-4226-b695-f438427bc8a4-operator-scripts\") pod \"nova-api-db-create-gxtdn\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.476922 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4d4c-account-create-update-sc92d"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.492571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmgr\" (UniqueName: \"kubernetes.io/projected/02edf717-dfa9-4226-b695-f438427bc8a4-kube-api-access-hfmgr\") pod \"nova-api-db-create-gxtdn\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.591631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdbs\" (UniqueName: \"kubernetes.io/projected/80c1db3e-740b-434d-b7bf-1938df4f05e2-kube-api-access-xmdbs\") pod \"nova-api-4d4c-account-create-update-sc92d\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.591875 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c423ac16-ac60-4ad3-9a8e-a1e5be701162-operator-scripts\") pod \"nova-cell0-db-create-4cg78\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.591966 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80c1db3e-740b-434d-b7bf-1938df4f05e2-operator-scripts\") pod \"nova-api-4d4c-account-create-update-sc92d\" (UID: 
\"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.592069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mktq\" (UniqueName: \"kubernetes.io/projected/c423ac16-ac60-4ad3-9a8e-a1e5be701162-kube-api-access-5mktq\") pod \"nova-cell0-db-create-4cg78\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.594099 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c423ac16-ac60-4ad3-9a8e-a1e5be701162-operator-scripts\") pod \"nova-cell0-db-create-4cg78\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.612177 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-s8vgv"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.614164 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.642240 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.670596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mktq\" (UniqueName: \"kubernetes.io/projected/c423ac16-ac60-4ad3-9a8e-a1e5be701162-kube-api-access-5mktq\") pod \"nova-cell0-db-create-4cg78\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.702950 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7202ec2d-61ea-45a1-9992-e48fdf57e0db-operator-scripts\") pod \"nova-cell1-db-create-s8vgv\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.715391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80c1db3e-740b-434d-b7bf-1938df4f05e2-operator-scripts\") pod \"nova-api-4d4c-account-create-update-sc92d\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.717296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mddfb\" (UniqueName: \"kubernetes.io/projected/7202ec2d-61ea-45a1-9992-e48fdf57e0db-kube-api-access-mddfb\") pod \"nova-cell1-db-create-s8vgv\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.717659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdbs\" (UniqueName: \"kubernetes.io/projected/80c1db3e-740b-434d-b7bf-1938df4f05e2-kube-api-access-xmdbs\") pod 
\"nova-api-4d4c-account-create-update-sc92d\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.717112 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s8vgv"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.720155 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80c1db3e-740b-434d-b7bf-1938df4f05e2-operator-scripts\") pod \"nova-api-4d4c-account-create-update-sc92d\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.734168 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.749305 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-990a-account-create-update-6chjv"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.750489 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.763082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdbs\" (UniqueName: \"kubernetes.io/projected/80c1db3e-740b-434d-b7bf-1938df4f05e2-kube-api-access-xmdbs\") pod \"nova-api-4d4c-account-create-update-sc92d\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.765756 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.767000 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-990a-account-create-update-6chjv"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.771398 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.826517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xnhk\" (UniqueName: \"kubernetes.io/projected/387441db-b51a-4a3d-b205-bb2f951a8700-kube-api-access-9xnhk\") pod \"nova-cell0-990a-account-create-update-6chjv\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.826642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7202ec2d-61ea-45a1-9992-e48fdf57e0db-operator-scripts\") pod \"nova-cell1-db-create-s8vgv\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.826690 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mddfb\" (UniqueName: \"kubernetes.io/projected/7202ec2d-61ea-45a1-9992-e48fdf57e0db-kube-api-access-mddfb\") pod \"nova-cell1-db-create-s8vgv\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.826716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387441db-b51a-4a3d-b205-bb2f951a8700-operator-scripts\") pod \"nova-cell0-990a-account-create-update-6chjv\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.827994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7202ec2d-61ea-45a1-9992-e48fdf57e0db-operator-scripts\") pod \"nova-cell1-db-create-s8vgv\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.828012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerStarted","Data":"c4bef92ca3a128046caf857030e0474dfc9ea4242a69816ddc0773484597589d"} Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.828052 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerStarted","Data":"ee474c860365e8fea7d9f8e97e450cb365802e7bec30bd06aad92b1b89cad0b1"} Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.855095 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mddfb\" (UniqueName: \"kubernetes.io/projected/7202ec2d-61ea-45a1-9992-e48fdf57e0db-kube-api-access-mddfb\") pod 
\"nova-cell1-db-create-s8vgv\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.883569 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0213-account-create-update-cls5l"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.884809 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.889541 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.908684 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0213-account-create-update-cls5l"] Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.930147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xnhk\" (UniqueName: \"kubernetes.io/projected/387441db-b51a-4a3d-b205-bb2f951a8700-kube-api-access-9xnhk\") pod \"nova-cell0-990a-account-create-update-6chjv\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.930834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387441db-b51a-4a3d-b205-bb2f951a8700-operator-scripts\") pod \"nova-cell0-990a-account-create-update-6chjv\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.931566 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387441db-b51a-4a3d-b205-bb2f951a8700-operator-scripts\") pod 
\"nova-cell0-990a-account-create-update-6chjv\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:18 crc kubenswrapper[4735]: I0317 01:31:18.956545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xnhk\" (UniqueName: \"kubernetes.io/projected/387441db-b51a-4a3d-b205-bb2f951a8700-kube-api-access-9xnhk\") pod \"nova-cell0-990a-account-create-update-6chjv\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.013295 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.033071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rv6\" (UniqueName: \"kubernetes.io/projected/b0148389-8934-4f76-8ba8-bd589163623d-kube-api-access-48rv6\") pod \"nova-cell1-0213-account-create-update-cls5l\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.033195 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0148389-8934-4f76-8ba8-bd589163623d-operator-scripts\") pod \"nova-cell1-0213-account-create-update-cls5l\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.134636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0148389-8934-4f76-8ba8-bd589163623d-operator-scripts\") pod \"nova-cell1-0213-account-create-update-cls5l\" (UID: 
\"b0148389-8934-4f76-8ba8-bd589163623d\") " pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.134763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rv6\" (UniqueName: \"kubernetes.io/projected/b0148389-8934-4f76-8ba8-bd589163623d-kube-api-access-48rv6\") pod \"nova-cell1-0213-account-create-update-cls5l\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.144192 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0148389-8934-4f76-8ba8-bd589163623d-operator-scripts\") pod \"nova-cell1-0213-account-create-update-cls5l\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.148250 4735 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod55f65e0a-d223-4c97-8911-d903950feb61"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod55f65e0a-d223-4c97-8911-d903950feb61] : Timed out while waiting for systemd to remove kubepods-besteffort-pod55f65e0a_d223_4c97_8911_d903950feb61.slice" Mar 17 01:31:19 crc kubenswrapper[4735]: E0317 01:31:19.148979 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod55f65e0a-d223-4c97-8911-d903950feb61] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod55f65e0a-d223-4c97-8911-d903950feb61] : Timed out while waiting for systemd to remove kubepods-besteffort-pod55f65e0a_d223_4c97_8911_d903950feb61.slice" pod="openstack/neutron-fbbbfb4-v48lt" podUID="55f65e0a-d223-4c97-8911-d903950feb61" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.172507 4735 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.204604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rv6\" (UniqueName: \"kubernetes.io/projected/b0148389-8934-4f76-8ba8-bd589163623d-kube-api-access-48rv6\") pod \"nova-cell1-0213-account-create-update-cls5l\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.216074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.561556 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gxtdn"] Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.699699 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4cg78"] Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.842180 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s8vgv"] Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.865448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4cg78" event={"ID":"c423ac16-ac60-4ad3-9a8e-a1e5be701162","Type":"ContainerStarted","Data":"3a40987d0915727709bb8fc01584afa04ce8986851efe0f2be9b51a9d72156cc"} Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.880898 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4d4c-account-create-update-sc92d"] Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.885214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gxtdn" 
event={"ID":"02edf717-dfa9-4226-b695-f438427bc8a4","Type":"ContainerStarted","Data":"de0b6c23dc85d3427b45c37eaa4f6970377697ff178a86a90f5e81d31ec4fa84"} Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.885283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gxtdn" event={"ID":"02edf717-dfa9-4226-b695-f438427bc8a4","Type":"ContainerStarted","Data":"507ec35a5735c49089a29ff5c325450f1d97ff1d27cf67baec8f72a4a64ac11a"} Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.897596 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbbbfb4-v48lt" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.898312 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerStarted","Data":"55f313f2b4136041b0680c29c19cc4eca109d9b33423f6a4c4de83375e77e209"} Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.912406 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-gxtdn" podStartSLOduration=1.912385722 podStartE2EDuration="1.912385722s" podCreationTimestamp="2026-03-17 01:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:31:19.902144812 +0000 UTC m=+1305.534377790" watchObservedRunningTime="2026-03-17 01:31:19.912385722 +0000 UTC m=+1305.544618700" Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.951741 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbbbfb4-v48lt"] Mar 17 01:31:19 crc kubenswrapper[4735]: I0317 01:31:19.977107 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fbbbfb4-v48lt"] Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.015976 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-990a-account-create-update-6chjv"] Mar 17 01:31:20 crc kubenswrapper[4735]: W0317 01:31:20.026358 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod387441db_b51a_4a3d_b205_bb2f951a8700.slice/crio-ea706ae3b86f0fe700b48f076797eedc64cf95a9a76753ddd1f45c1d8a214f35 WatchSource:0}: Error finding container ea706ae3b86f0fe700b48f076797eedc64cf95a9a76753ddd1f45c1d8a214f35: Status 404 returned error can't find the container with id ea706ae3b86f0fe700b48f076797eedc64cf95a9a76753ddd1f45c1d8a214f35 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.033605 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0213-account-create-update-cls5l"] Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.919248 4735 generic.go:334] "Generic (PLEG): container finished" podID="c423ac16-ac60-4ad3-9a8e-a1e5be701162" containerID="1e877041438750a34b04eac5a19fdce50746446ec71044fcedf076ce749be50e" exitCode=0 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.921570 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4cg78" event={"ID":"c423ac16-ac60-4ad3-9a8e-a1e5be701162","Type":"ContainerDied","Data":"1e877041438750a34b04eac5a19fdce50746446ec71044fcedf076ce749be50e"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.933244 4735 generic.go:334] "Generic (PLEG): container finished" podID="387441db-b51a-4a3d-b205-bb2f951a8700" containerID="ee78b8cf6c1f046e3c87d5345bc812f9639f13d2a6dcdc7f00d688e9632a96bf" exitCode=0 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.933325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-990a-account-create-update-6chjv" event={"ID":"387441db-b51a-4a3d-b205-bb2f951a8700","Type":"ContainerDied","Data":"ee78b8cf6c1f046e3c87d5345bc812f9639f13d2a6dcdc7f00d688e9632a96bf"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.933351 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-990a-account-create-update-6chjv" event={"ID":"387441db-b51a-4a3d-b205-bb2f951a8700","Type":"ContainerStarted","Data":"ea706ae3b86f0fe700b48f076797eedc64cf95a9a76753ddd1f45c1d8a214f35"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.951212 4735 generic.go:334] "Generic (PLEG): container finished" podID="7202ec2d-61ea-45a1-9992-e48fdf57e0db" containerID="26657117c1551f606db97aa91b33829d84fcf2132b80e63543db2cc90903f31f" exitCode=0 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.951295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s8vgv" event={"ID":"7202ec2d-61ea-45a1-9992-e48fdf57e0db","Type":"ContainerDied","Data":"26657117c1551f606db97aa91b33829d84fcf2132b80e63543db2cc90903f31f"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.951320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s8vgv" event={"ID":"7202ec2d-61ea-45a1-9992-e48fdf57e0db","Type":"ContainerStarted","Data":"417684ed5531ba0e3ca3dbbd49d60695c1a8540f1267ae317ae2416b111c732a"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.952756 4735 generic.go:334] "Generic (PLEG): container finished" podID="80c1db3e-740b-434d-b7bf-1938df4f05e2" containerID="72919f521794f74d828599f9cde0ac0b3392679ea7db2e8a5b8e1cd7aae150f7" exitCode=0 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.952838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4d4c-account-create-update-sc92d" event={"ID":"80c1db3e-740b-434d-b7bf-1938df4f05e2","Type":"ContainerDied","Data":"72919f521794f74d828599f9cde0ac0b3392679ea7db2e8a5b8e1cd7aae150f7"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.952937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4d4c-account-create-update-sc92d" 
event={"ID":"80c1db3e-740b-434d-b7bf-1938df4f05e2","Type":"ContainerStarted","Data":"896ba8caf442a676e16bb85007f918a7aada1a5d2914965b43139290d4d4d99b"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.960679 4735 generic.go:334] "Generic (PLEG): container finished" podID="b0148389-8934-4f76-8ba8-bd589163623d" containerID="10c9beb590668ce43d1ee2c9b51153dbb13efc6ca2535ace55b3175da605f1a8" exitCode=0 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.960754 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0213-account-create-update-cls5l" event={"ID":"b0148389-8934-4f76-8ba8-bd589163623d","Type":"ContainerDied","Data":"10c9beb590668ce43d1ee2c9b51153dbb13efc6ca2535ace55b3175da605f1a8"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.960781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0213-account-create-update-cls5l" event={"ID":"b0148389-8934-4f76-8ba8-bd589163623d","Type":"ContainerStarted","Data":"e9c9ad4009a11115a0520272f8642f2d20595021b5a76fe87ef4ffe291401ed8"} Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.992395 4735 generic.go:334] "Generic (PLEG): container finished" podID="02edf717-dfa9-4226-b695-f438427bc8a4" containerID="de0b6c23dc85d3427b45c37eaa4f6970377697ff178a86a90f5e81d31ec4fa84" exitCode=0 Mar 17 01:31:20 crc kubenswrapper[4735]: I0317 01:31:20.992507 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gxtdn" event={"ID":"02edf717-dfa9-4226-b695-f438427bc8a4","Type":"ContainerDied","Data":"de0b6c23dc85d3427b45c37eaa4f6970377697ff178a86a90f5e81d31ec4fa84"} Mar 17 01:31:21 crc kubenswrapper[4735]: I0317 01:31:21.090070 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f65e0a-d223-4c97-8911-d903950feb61" path="/var/lib/kubelet/pods/55f65e0a-d223-4c97-8911-d903950feb61/volumes" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.002522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerStarted","Data":"454fb4fe62491f6c6395ce8a39d2bd86839634d4b88f3e0b7c1c762b12611779"} Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.003297 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-central-agent" containerID="cri-o://ee474c860365e8fea7d9f8e97e450cb365802e7bec30bd06aad92b1b89cad0b1" gracePeriod=30 Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.003888 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="proxy-httpd" containerID="cri-o://454fb4fe62491f6c6395ce8a39d2bd86839634d4b88f3e0b7c1c762b12611779" gracePeriod=30 Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.003980 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="sg-core" containerID="cri-o://55f313f2b4136041b0680c29c19cc4eca109d9b33423f6a4c4de83375e77e209" gracePeriod=30 Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.004018 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-notification-agent" containerID="cri-o://c4bef92ca3a128046caf857030e0474dfc9ea4242a69816ddc0773484597589d" gracePeriod=30 Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.478311 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.507567 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.646033267 podStartE2EDuration="6.507549102s" podCreationTimestamp="2026-03-17 01:31:16 +0000 UTC" firstStartedPulling="2026-03-17 01:31:17.486798126 +0000 UTC m=+1303.119031104" lastFinishedPulling="2026-03-17 01:31:21.348313961 +0000 UTC m=+1306.980546939" observedRunningTime="2026-03-17 01:31:22.049106564 +0000 UTC m=+1307.681339542" watchObservedRunningTime="2026-03-17 01:31:22.507549102 +0000 UTC m=+1308.139782070" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.516465 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48rv6\" (UniqueName: \"kubernetes.io/projected/b0148389-8934-4f76-8ba8-bd589163623d-kube-api-access-48rv6\") pod \"b0148389-8934-4f76-8ba8-bd589163623d\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.516656 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0148389-8934-4f76-8ba8-bd589163623d-operator-scripts\") pod \"b0148389-8934-4f76-8ba8-bd589163623d\" (UID: \"b0148389-8934-4f76-8ba8-bd589163623d\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.517280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0148389-8934-4f76-8ba8-bd589163623d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0148389-8934-4f76-8ba8-bd589163623d" (UID: "b0148389-8934-4f76-8ba8-bd589163623d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.550014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0148389-8934-4f76-8ba8-bd589163623d-kube-api-access-48rv6" (OuterVolumeSpecName: "kube-api-access-48rv6") pod "b0148389-8934-4f76-8ba8-bd589163623d" (UID: "b0148389-8934-4f76-8ba8-bd589163623d"). InnerVolumeSpecName "kube-api-access-48rv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.619725 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48rv6\" (UniqueName: \"kubernetes.io/projected/b0148389-8934-4f76-8ba8-bd589163623d-kube-api-access-48rv6\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.619762 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0148389-8934-4f76-8ba8-bd589163623d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.819827 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.868158 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.871962 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.888465 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.921002 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.938263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02edf717-dfa9-4226-b695-f438427bc8a4-operator-scripts\") pod \"02edf717-dfa9-4226-b695-f438427bc8a4\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.938330 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387441db-b51a-4a3d-b205-bb2f951a8700-operator-scripts\") pod \"387441db-b51a-4a3d-b205-bb2f951a8700\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.938363 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c423ac16-ac60-4ad3-9a8e-a1e5be701162-operator-scripts\") pod \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.938508 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfmgr\" (UniqueName: \"kubernetes.io/projected/02edf717-dfa9-4226-b695-f438427bc8a4-kube-api-access-hfmgr\") pod \"02edf717-dfa9-4226-b695-f438427bc8a4\" (UID: \"02edf717-dfa9-4226-b695-f438427bc8a4\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.938601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mktq\" (UniqueName: \"kubernetes.io/projected/c423ac16-ac60-4ad3-9a8e-a1e5be701162-kube-api-access-5mktq\") pod \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\" (UID: \"c423ac16-ac60-4ad3-9a8e-a1e5be701162\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.938632 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xnhk\" (UniqueName: \"kubernetes.io/projected/387441db-b51a-4a3d-b205-bb2f951a8700-kube-api-access-9xnhk\") pod \"387441db-b51a-4a3d-b205-bb2f951a8700\" (UID: \"387441db-b51a-4a3d-b205-bb2f951a8700\") " Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.940095 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c423ac16-ac60-4ad3-9a8e-a1e5be701162-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c423ac16-ac60-4ad3-9a8e-a1e5be701162" (UID: "c423ac16-ac60-4ad3-9a8e-a1e5be701162"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.940535 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02edf717-dfa9-4226-b695-f438427bc8a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02edf717-dfa9-4226-b695-f438427bc8a4" (UID: "02edf717-dfa9-4226-b695-f438427bc8a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.940985 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387441db-b51a-4a3d-b205-bb2f951a8700-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "387441db-b51a-4a3d-b205-bb2f951a8700" (UID: "387441db-b51a-4a3d-b205-bb2f951a8700"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.948072 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387441db-b51a-4a3d-b205-bb2f951a8700-kube-api-access-9xnhk" (OuterVolumeSpecName: "kube-api-access-9xnhk") pod "387441db-b51a-4a3d-b205-bb2f951a8700" (UID: "387441db-b51a-4a3d-b205-bb2f951a8700"). 
InnerVolumeSpecName "kube-api-access-9xnhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.950013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02edf717-dfa9-4226-b695-f438427bc8a4-kube-api-access-hfmgr" (OuterVolumeSpecName: "kube-api-access-hfmgr") pod "02edf717-dfa9-4226-b695-f438427bc8a4" (UID: "02edf717-dfa9-4226-b695-f438427bc8a4"). InnerVolumeSpecName "kube-api-access-hfmgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:22 crc kubenswrapper[4735]: I0317 01:31:22.951168 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c423ac16-ac60-4ad3-9a8e-a1e5be701162-kube-api-access-5mktq" (OuterVolumeSpecName: "kube-api-access-5mktq") pod "c423ac16-ac60-4ad3-9a8e-a1e5be701162" (UID: "c423ac16-ac60-4ad3-9a8e-a1e5be701162"). InnerVolumeSpecName "kube-api-access-5mktq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.010355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4d4c-account-create-update-sc92d" event={"ID":"80c1db3e-740b-434d-b7bf-1938df4f05e2","Type":"ContainerDied","Data":"896ba8caf442a676e16bb85007f918a7aada1a5d2914965b43139290d4d4d99b"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.010392 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="896ba8caf442a676e16bb85007f918a7aada1a5d2914965b43139290d4d4d99b" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.010443 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4d4c-account-create-update-sc92d" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.011732 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0213-account-create-update-cls5l" event={"ID":"b0148389-8934-4f76-8ba8-bd589163623d","Type":"ContainerDied","Data":"e9c9ad4009a11115a0520272f8642f2d20595021b5a76fe87ef4ffe291401ed8"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.011751 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9c9ad4009a11115a0520272f8642f2d20595021b5a76fe87ef4ffe291401ed8" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.011794 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0213-account-create-update-cls5l" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.020472 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gxtdn" event={"ID":"02edf717-dfa9-4226-b695-f438427bc8a4","Type":"ContainerDied","Data":"507ec35a5735c49089a29ff5c325450f1d97ff1d27cf67baec8f72a4a64ac11a"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.020508 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507ec35a5735c49089a29ff5c325450f1d97ff1d27cf67baec8f72a4a64ac11a" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.020565 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gxtdn" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.028530 4735 generic.go:334] "Generic (PLEG): container finished" podID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerID="454fb4fe62491f6c6395ce8a39d2bd86839634d4b88f3e0b7c1c762b12611779" exitCode=0 Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.028637 4735 generic.go:334] "Generic (PLEG): container finished" podID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerID="55f313f2b4136041b0680c29c19cc4eca109d9b33423f6a4c4de83375e77e209" exitCode=2 Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.028697 4735 generic.go:334] "Generic (PLEG): container finished" podID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerID="c4bef92ca3a128046caf857030e0474dfc9ea4242a69816ddc0773484597589d" exitCode=0 Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.028747 4735 generic.go:334] "Generic (PLEG): container finished" podID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerID="ee474c860365e8fea7d9f8e97e450cb365802e7bec30bd06aad92b1b89cad0b1" exitCode=0 Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.028577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerDied","Data":"454fb4fe62491f6c6395ce8a39d2bd86839634d4b88f3e0b7c1c762b12611779"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.028935 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerDied","Data":"55f313f2b4136041b0680c29c19cc4eca109d9b33423f6a4c4de83375e77e209"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.029006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerDied","Data":"c4bef92ca3a128046caf857030e0474dfc9ea4242a69816ddc0773484597589d"} Mar 17 01:31:23 crc 
kubenswrapper[4735]: I0317 01:31:23.029064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerDied","Data":"ee474c860365e8fea7d9f8e97e450cb365802e7bec30bd06aad92b1b89cad0b1"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.041562 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmdbs\" (UniqueName: \"kubernetes.io/projected/80c1db3e-740b-434d-b7bf-1938df4f05e2-kube-api-access-xmdbs\") pod \"80c1db3e-740b-434d-b7bf-1938df4f05e2\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.041754 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80c1db3e-740b-434d-b7bf-1938df4f05e2-operator-scripts\") pod \"80c1db3e-740b-434d-b7bf-1938df4f05e2\" (UID: \"80c1db3e-740b-434d-b7bf-1938df4f05e2\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.041980 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7202ec2d-61ea-45a1-9992-e48fdf57e0db-operator-scripts\") pod \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mddfb\" (UniqueName: \"kubernetes.io/projected/7202ec2d-61ea-45a1-9992-e48fdf57e0db-kube-api-access-mddfb\") pod \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\" (UID: \"7202ec2d-61ea-45a1-9992-e48fdf57e0db\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042573 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mktq\" (UniqueName: \"kubernetes.io/projected/c423ac16-ac60-4ad3-9a8e-a1e5be701162-kube-api-access-5mktq\") on node \"crc\" 
DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042639 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xnhk\" (UniqueName: \"kubernetes.io/projected/387441db-b51a-4a3d-b205-bb2f951a8700-kube-api-access-9xnhk\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042695 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02edf717-dfa9-4226-b695-f438427bc8a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042756 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387441db-b51a-4a3d-b205-bb2f951a8700-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042810 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c423ac16-ac60-4ad3-9a8e-a1e5be701162-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.042916 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfmgr\" (UniqueName: \"kubernetes.io/projected/02edf717-dfa9-4226-b695-f438427bc8a4-kube-api-access-hfmgr\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.046611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7202ec2d-61ea-45a1-9992-e48fdf57e0db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7202ec2d-61ea-45a1-9992-e48fdf57e0db" (UID: "7202ec2d-61ea-45a1-9992-e48fdf57e0db"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.047342 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c1db3e-740b-434d-b7bf-1938df4f05e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80c1db3e-740b-434d-b7bf-1938df4f05e2" (UID: "80c1db3e-740b-434d-b7bf-1938df4f05e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.051204 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cg78" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.051339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4cg78" event={"ID":"c423ac16-ac60-4ad3-9a8e-a1e5be701162","Type":"ContainerDied","Data":"3a40987d0915727709bb8fc01584afa04ce8986851efe0f2be9b51a9d72156cc"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.051387 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a40987d0915727709bb8fc01584afa04ce8986851efe0f2be9b51a9d72156cc" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.066832 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7202ec2d-61ea-45a1-9992-e48fdf57e0db-kube-api-access-mddfb" (OuterVolumeSpecName: "kube-api-access-mddfb") pod "7202ec2d-61ea-45a1-9992-e48fdf57e0db" (UID: "7202ec2d-61ea-45a1-9992-e48fdf57e0db"). InnerVolumeSpecName "kube-api-access-mddfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.070930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-990a-account-create-update-6chjv" event={"ID":"387441db-b51a-4a3d-b205-bb2f951a8700","Type":"ContainerDied","Data":"ea706ae3b86f0fe700b48f076797eedc64cf95a9a76753ddd1f45c1d8a214f35"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.070962 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea706ae3b86f0fe700b48f076797eedc64cf95a9a76753ddd1f45c1d8a214f35" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.071012 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-990a-account-create-update-6chjv" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.075998 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c1db3e-740b-434d-b7bf-1938df4f05e2-kube-api-access-xmdbs" (OuterVolumeSpecName: "kube-api-access-xmdbs") pod "80c1db3e-740b-434d-b7bf-1938df4f05e2" (UID: "80c1db3e-740b-434d-b7bf-1938df4f05e2"). InnerVolumeSpecName "kube-api-access-xmdbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.078462 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s8vgv" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.088410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s8vgv" event={"ID":"7202ec2d-61ea-45a1-9992-e48fdf57e0db","Type":"ContainerDied","Data":"417684ed5531ba0e3ca3dbbd49d60695c1a8540f1267ae317ae2416b111c732a"} Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.088450 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="417684ed5531ba0e3ca3dbbd49d60695c1a8540f1267ae317ae2416b111c732a" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.145034 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmdbs\" (UniqueName: \"kubernetes.io/projected/80c1db3e-740b-434d-b7bf-1938df4f05e2-kube-api-access-xmdbs\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.145057 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80c1db3e-740b-434d-b7bf-1938df4f05e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.145066 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7202ec2d-61ea-45a1-9992-e48fdf57e0db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.145074 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mddfb\" (UniqueName: \"kubernetes.io/projected/7202ec2d-61ea-45a1-9992-e48fdf57e0db-kube-api-access-mddfb\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.688560 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.754693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzpq4\" (UniqueName: \"kubernetes.io/projected/20dc1c3e-b968-4c74-982e-bccffcdfa318-kube-api-access-vzpq4\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.754789 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-sg-core-conf-yaml\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.754943 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-run-httpd\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.754968 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-config-data\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.755040 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-combined-ca-bundle\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.755108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-scripts\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.755176 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-log-httpd\") pod \"20dc1c3e-b968-4c74-982e-bccffcdfa318\" (UID: \"20dc1c3e-b968-4c74-982e-bccffcdfa318\") " Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.755462 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.755814 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.758570 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.794048 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20dc1c3e-b968-4c74-982e-bccffcdfa318-kube-api-access-vzpq4" (OuterVolumeSpecName: "kube-api-access-vzpq4") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). 
InnerVolumeSpecName "kube-api-access-vzpq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.796125 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-scripts" (OuterVolumeSpecName: "scripts") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.865745 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.865774 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20dc1c3e-b968-4c74-982e-bccffcdfa318-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.865784 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzpq4\" (UniqueName: \"kubernetes.io/projected/20dc1c3e-b968-4c74-982e-bccffcdfa318-kube-api-access-vzpq4\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.881933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.885951 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.919590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-config-data" (OuterVolumeSpecName: "config-data") pod "20dc1c3e-b968-4c74-982e-bccffcdfa318" (UID: "20dc1c3e-b968-4c74-982e-bccffcdfa318"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.967663 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.967697 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:23 crc kubenswrapper[4735]: I0317 01:31:23.967735 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20dc1c3e-b968-4c74-982e-bccffcdfa318-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.087710 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"20dc1c3e-b968-4c74-982e-bccffcdfa318","Type":"ContainerDied","Data":"9c7083d73c0f48a2d41e2a3c46fb75e1925a1f71ae67105831e19d0ce4611723"} Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.087762 4735 scope.go:117] "RemoveContainer" containerID="454fb4fe62491f6c6395ce8a39d2bd86839634d4b88f3e0b7c1c762b12611779" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.087781 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.108257 4735 scope.go:117] "RemoveContainer" containerID="55f313f2b4136041b0680c29c19cc4eca109d9b33423f6a4c4de83375e77e209" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.120179 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.124035 4735 scope.go:117] "RemoveContainer" containerID="c4bef92ca3a128046caf857030e0474dfc9ea4242a69816ddc0773484597589d" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.129841 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.144235 4735 scope.go:117] "RemoveContainer" containerID="ee474c860365e8fea7d9f8e97e450cb365802e7bec30bd06aad92b1b89cad0b1" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.149429 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152022 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="sg-core" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152102 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="sg-core" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152159 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7202ec2d-61ea-45a1-9992-e48fdf57e0db" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152230 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7202ec2d-61ea-45a1-9992-e48fdf57e0db" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152302 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0148389-8934-4f76-8ba8-bd589163623d" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152355 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0148389-8934-4f76-8ba8-bd589163623d" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152416 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c423ac16-ac60-4ad3-9a8e-a1e5be701162" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152465 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c423ac16-ac60-4ad3-9a8e-a1e5be701162" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152522 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c1db3e-740b-434d-b7bf-1938df4f05e2" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152578 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c1db3e-740b-434d-b7bf-1938df4f05e2" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152640 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="proxy-httpd" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152696 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="proxy-httpd" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 
01:31:24.152750 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02edf717-dfa9-4226-b695-f438427bc8a4" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152804 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="02edf717-dfa9-4226-b695-f438427bc8a4" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152873 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-central-agent" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.152933 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-central-agent" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.152993 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-notification-agent" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153043 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-notification-agent" Mar 17 01:31:24 crc kubenswrapper[4735]: E0317 01:31:24.153114 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387441db-b51a-4a3d-b205-bb2f951a8700" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153165 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="387441db-b51a-4a3d-b205-bb2f951a8700" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153400 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="387441db-b51a-4a3d-b205-bb2f951a8700" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153471 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="02edf717-dfa9-4226-b695-f438427bc8a4" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153529 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c423ac16-ac60-4ad3-9a8e-a1e5be701162" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153582 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7202ec2d-61ea-45a1-9992-e48fdf57e0db" containerName="mariadb-database-create" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153635 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0148389-8934-4f76-8ba8-bd589163623d" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153687 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-notification-agent" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153751 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c1db3e-740b-434d-b7bf-1938df4f05e2" containerName="mariadb-account-create-update" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.153807 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="proxy-httpd" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.155363 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="sg-core" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.155804 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" containerName="ceilometer-central-agent" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.158179 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.162558 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.162819 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.167433 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-scripts\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273801 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gjbr\" (UniqueName: \"kubernetes.io/projected/d16ac34c-9ffc-45d4-bd3e-dbb315763700-kube-api-access-4gjbr\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273816 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-log-httpd\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " 
pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273844 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-run-httpd\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.273979 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-config-data\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.375635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gjbr\" (UniqueName: \"kubernetes.io/projected/d16ac34c-9ffc-45d4-bd3e-dbb315763700-kube-api-access-4gjbr\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.375933 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-log-httpd\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.376025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.376097 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-run-httpd\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.376184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-config-data\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.376317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-scripts\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.376401 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.376501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-log-httpd\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: 
I0317 01:31:24.376558 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-run-httpd\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.381423 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.381633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.384305 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-config-data\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.384464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-scripts\") pod \"ceilometer-0\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.395440 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gjbr\" (UniqueName: \"kubernetes.io/projected/d16ac34c-9ffc-45d4-bd3e-dbb315763700-kube-api-access-4gjbr\") pod \"ceilometer-0\" (UID: 
\"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.472634 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:24 crc kubenswrapper[4735]: I0317 01:31:24.990317 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:25 crc kubenswrapper[4735]: I0317 01:31:25.083053 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20dc1c3e-b968-4c74-982e-bccffcdfa318" path="/var/lib/kubelet/pods/20dc1c3e-b968-4c74-982e-bccffcdfa318/volumes" Mar 17 01:31:25 crc kubenswrapper[4735]: I0317 01:31:25.096962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerStarted","Data":"553df59b71908666f1948ad4318139f8357569a2309c68d7f4338524a198dc9c"} Mar 17 01:31:26 crc kubenswrapper[4735]: I0317 01:31:26.108899 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerStarted","Data":"1e306c4e72197d9f1b335860b0ce3705791de61a805e1f993de5befd694f05d9"} Mar 17 01:31:26 crc kubenswrapper[4735]: I0317 01:31:26.109391 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerStarted","Data":"0771519bc97c2af4c1584f9db1abf20466f093c86d49456722c6143d02a4f631"} Mar 17 01:31:27 crc kubenswrapper[4735]: I0317 01:31:27.119226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerStarted","Data":"827e849fa95d2838689e75791a7ff8532ed169af250e8140bfdd8ebc0618335f"} Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.063995 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-bb4vj"] Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.065330 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.067355 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.067390 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wl4fd" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.067565 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.091545 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bb4vj"] Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.164823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-config-data\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.164948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qth7h\" (UniqueName: \"kubernetes.io/projected/6275d08c-4457-4de4-aa66-98f5666568f5-kube-api-access-qth7h\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.164985 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.167043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-scripts\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.173185 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerStarted","Data":"28e6f66a35e80f23358978da203f28930da0ec8202e70d3d7d4d7449890530f7"} Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.174108 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.206588 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.568707734 podStartE2EDuration="5.206570621s" podCreationTimestamp="2026-03-17 01:31:24 +0000 UTC" firstStartedPulling="2026-03-17 01:31:24.995435184 +0000 UTC m=+1310.627668172" lastFinishedPulling="2026-03-17 01:31:28.633298081 +0000 UTC m=+1314.265531059" observedRunningTime="2026-03-17 01:31:29.193979302 +0000 UTC m=+1314.826212280" watchObservedRunningTime="2026-03-17 01:31:29.206570621 +0000 UTC m=+1314.838803599" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.268364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qth7h\" (UniqueName: \"kubernetes.io/projected/6275d08c-4457-4de4-aa66-98f5666568f5-kube-api-access-qth7h\") pod 
\"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.268428 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.268517 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-scripts\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.268558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-config-data\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.276523 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-config-data\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.288665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.293985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qth7h\" (UniqueName: \"kubernetes.io/projected/6275d08c-4457-4de4-aa66-98f5666568f5-kube-api-access-qth7h\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.301429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-scripts\") pod \"nova-cell0-conductor-db-sync-bb4vj\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.380241 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:29 crc kubenswrapper[4735]: I0317 01:31:29.865143 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bb4vj"] Mar 17 01:31:30 crc kubenswrapper[4735]: I0317 01:31:30.192487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" event={"ID":"6275d08c-4457-4de4-aa66-98f5666568f5","Type":"ContainerStarted","Data":"ec19b1ce1e2878d24771eb8eab6461ff1f7a59884d87018d7961a755812b37e3"} Mar 17 01:31:38 crc kubenswrapper[4735]: I0317 01:31:38.576365 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:38 crc kubenswrapper[4735]: I0317 01:31:38.577278 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-central-agent" containerID="cri-o://0771519bc97c2af4c1584f9db1abf20466f093c86d49456722c6143d02a4f631" gracePeriod=30 Mar 17 01:31:38 crc kubenswrapper[4735]: I0317 01:31:38.577332 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-notification-agent" containerID="cri-o://1e306c4e72197d9f1b335860b0ce3705791de61a805e1f993de5befd694f05d9" gracePeriod=30 Mar 17 01:31:38 crc kubenswrapper[4735]: I0317 01:31:38.577352 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="sg-core" containerID="cri-o://827e849fa95d2838689e75791a7ff8532ed169af250e8140bfdd8ebc0618335f" gracePeriod=30 Mar 17 01:31:38 crc kubenswrapper[4735]: I0317 01:31:38.577368 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" 
containerName="proxy-httpd" containerID="cri-o://28e6f66a35e80f23358978da203f28930da0ec8202e70d3d7d4d7449890530f7" gracePeriod=30 Mar 17 01:31:38 crc kubenswrapper[4735]: I0317 01:31:38.601750 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.204:3000/\": EOF" Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.278949 4735 generic.go:334] "Generic (PLEG): container finished" podID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerID="28e6f66a35e80f23358978da203f28930da0ec8202e70d3d7d4d7449890530f7" exitCode=0 Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.278977 4735 generic.go:334] "Generic (PLEG): container finished" podID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerID="827e849fa95d2838689e75791a7ff8532ed169af250e8140bfdd8ebc0618335f" exitCode=2 Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.278986 4735 generic.go:334] "Generic (PLEG): container finished" podID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerID="1e306c4e72197d9f1b335860b0ce3705791de61a805e1f993de5befd694f05d9" exitCode=0 Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.278992 4735 generic.go:334] "Generic (PLEG): container finished" podID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerID="0771519bc97c2af4c1584f9db1abf20466f093c86d49456722c6143d02a4f631" exitCode=0 Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.279010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerDied","Data":"28e6f66a35e80f23358978da203f28930da0ec8202e70d3d7d4d7449890530f7"} Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.279453 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerDied","Data":"827e849fa95d2838689e75791a7ff8532ed169af250e8140bfdd8ebc0618335f"} Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.279468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerDied","Data":"1e306c4e72197d9f1b335860b0ce3705791de61a805e1f993de5befd694f05d9"} Mar 17 01:31:39 crc kubenswrapper[4735]: I0317 01:31:39.279476 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerDied","Data":"0771519bc97c2af4c1584f9db1abf20466f093c86d49456722c6143d02a4f631"} Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.758770 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.903103 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-sg-core-conf-yaml\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.903597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-log-httpd\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.903641 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-combined-ca-bundle\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc 
kubenswrapper[4735]: I0317 01:31:40.903664 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gjbr\" (UniqueName: \"kubernetes.io/projected/d16ac34c-9ffc-45d4-bd3e-dbb315763700-kube-api-access-4gjbr\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.903825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-run-httpd\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.903916 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-scripts\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.903978 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-config-data\") pod \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\" (UID: \"d16ac34c-9ffc-45d4-bd3e-dbb315763700\") " Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.904343 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.904510 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.913309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16ac34c-9ffc-45d4-bd3e-dbb315763700-kube-api-access-4gjbr" (OuterVolumeSpecName: "kube-api-access-4gjbr") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "kube-api-access-4gjbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.922114 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-scripts" (OuterVolumeSpecName: "scripts") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.944550 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:40 crc kubenswrapper[4735]: I0317 01:31:40.994242 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.006499 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.006525 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.006534 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.006544 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.006552 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gjbr\" (UniqueName: \"kubernetes.io/projected/d16ac34c-9ffc-45d4-bd3e-dbb315763700-kube-api-access-4gjbr\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.006560 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d16ac34c-9ffc-45d4-bd3e-dbb315763700-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.010978 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-config-data" (OuterVolumeSpecName: "config-data") pod "d16ac34c-9ffc-45d4-bd3e-dbb315763700" (UID: "d16ac34c-9ffc-45d4-bd3e-dbb315763700"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.107671 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16ac34c-9ffc-45d4-bd3e-dbb315763700-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.297838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d16ac34c-9ffc-45d4-bd3e-dbb315763700","Type":"ContainerDied","Data":"553df59b71908666f1948ad4318139f8357569a2309c68d7f4338524a198dc9c"} Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.298214 4735 scope.go:117] "RemoveContainer" containerID="28e6f66a35e80f23358978da203f28930da0ec8202e70d3d7d4d7449890530f7" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.298107 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.299684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" event={"ID":"6275d08c-4457-4de4-aa66-98f5666568f5","Type":"ContainerStarted","Data":"cf2442c3c95bbc6e66efc775ac7000bd8eb6691858efefbbbbe2e3ab5a20d394"} Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.317709 4735 scope.go:117] "RemoveContainer" containerID="827e849fa95d2838689e75791a7ff8532ed169af250e8140bfdd8ebc0618335f" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.325475 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" podStartSLOduration=1.6948206209999999 podStartE2EDuration="12.32545977s" podCreationTimestamp="2026-03-17 01:31:29 +0000 UTC" firstStartedPulling="2026-03-17 01:31:29.872092481 +0000 UTC m=+1315.504325459" lastFinishedPulling="2026-03-17 01:31:40.50273163 +0000 UTC m=+1326.134964608" observedRunningTime="2026-03-17 01:31:41.322946718 +0000 UTC m=+1326.955179696" watchObservedRunningTime="2026-03-17 01:31:41.32545977 +0000 UTC m=+1326.957692748" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.337307 4735 scope.go:117] "RemoveContainer" containerID="1e306c4e72197d9f1b335860b0ce3705791de61a805e1f993de5befd694f05d9" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.349492 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.361429 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.368034 4735 scope.go:117] "RemoveContainer" containerID="0771519bc97c2af4c1584f9db1abf20466f093c86d49456722c6143d02a4f631" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.379709 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 
01:31:41 crc kubenswrapper[4735]: E0317 01:31:41.380130 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="proxy-httpd" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380154 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="proxy-httpd" Mar 17 01:31:41 crc kubenswrapper[4735]: E0317 01:31:41.380184 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-central-agent" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380194 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-central-agent" Mar 17 01:31:41 crc kubenswrapper[4735]: E0317 01:31:41.380223 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-notification-agent" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380231 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-notification-agent" Mar 17 01:31:41 crc kubenswrapper[4735]: E0317 01:31:41.380249 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="sg-core" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380256 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="sg-core" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380444 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="sg-core" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380473 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" 
containerName="ceilometer-notification-agent" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380490 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="ceilometer-central-agent" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.380501 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" containerName="proxy-httpd" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.390678 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.401980 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.406673 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.410195 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515383 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-log-httpd\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515473 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-scripts\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515558 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lkzj\" (UniqueName: \"kubernetes.io/projected/41e1fcd6-228d-422a-bf3a-ecb839720b4e-kube-api-access-8lkzj\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515580 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-config-data\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.515668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-run-httpd\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617214 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-scripts\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lkzj\" (UniqueName: \"kubernetes.io/projected/41e1fcd6-228d-422a-bf3a-ecb839720b4e-kube-api-access-8lkzj\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617324 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-config-data\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617340 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-run-httpd\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.617380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.618539 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-log-httpd\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.618567 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-run-httpd\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.624192 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.627417 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.629411 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-scripts\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.638739 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-config-data\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.647579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lkzj\" (UniqueName: \"kubernetes.io/projected/41e1fcd6-228d-422a-bf3a-ecb839720b4e-kube-api-access-8lkzj\") pod \"ceilometer-0\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " pod="openstack/ceilometer-0" Mar 17 01:31:41 crc kubenswrapper[4735]: I0317 01:31:41.740841 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:42 crc kubenswrapper[4735]: I0317 01:31:42.239259 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:42 crc kubenswrapper[4735]: I0317 01:31:42.310884 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerStarted","Data":"b086f112b948f0f324e710aabb08d387f3474d7543414a05b836fefbbddba394"} Mar 17 01:31:43 crc kubenswrapper[4735]: I0317 01:31:43.084747 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16ac34c-9ffc-45d4-bd3e-dbb315763700" path="/var/lib/kubelet/pods/d16ac34c-9ffc-45d4-bd3e-dbb315763700/volumes" Mar 17 01:31:43 crc kubenswrapper[4735]: I0317 01:31:43.320734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerStarted","Data":"969e8e35c4d4d7d0903f9275236535e93cb0682a9c9145fb938f6ddf9fe1d9da"} Mar 17 01:31:43 crc kubenswrapper[4735]: I0317 01:31:43.321609 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerStarted","Data":"438338f4f7328b5d0481aab2c83681736370429fda12290a8a23d5bc829c722d"} Mar 17 01:31:44 crc kubenswrapper[4735]: I0317 01:31:44.331032 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerStarted","Data":"565fd3e372474697cdde8bd5afa29c83ac6cc628150d4e789f7df59ac3d9d045"} Mar 17 01:31:46 crc kubenswrapper[4735]: I0317 01:31:46.359434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerStarted","Data":"d6d068d82c4ea529054907db20a118ce79c005a5b5118c44e48be97808a7c2f2"} Mar 17 01:31:46 crc kubenswrapper[4735]: I0317 01:31:46.359891 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:31:46 crc kubenswrapper[4735]: I0317 01:31:46.382992 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.327034412 podStartE2EDuration="5.382972905s" podCreationTimestamp="2026-03-17 01:31:41 +0000 UTC" firstStartedPulling="2026-03-17 01:31:42.245237947 +0000 UTC m=+1327.877470925" lastFinishedPulling="2026-03-17 01:31:45.30117644 +0000 UTC m=+1330.933409418" observedRunningTime="2026-03-17 01:31:46.375357709 +0000 UTC m=+1332.007590687" watchObservedRunningTime="2026-03-17 01:31:46.382972905 +0000 UTC m=+1332.015205883" Mar 17 01:31:47 crc kubenswrapper[4735]: I0317 01:31:47.691636 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-86957cdc-x94sr" podUID="c50099a5-67b0-4c0b-be11-146f30190beb" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.186:8004/healthcheck\": dial tcp 10.217.0.186:8004: connect: connection refused" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.281544 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.355589 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data-custom\") pod \"c50099a5-67b0-4c0b-be11-146f30190beb\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.355660 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-combined-ca-bundle\") pod \"c50099a5-67b0-4c0b-be11-146f30190beb\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.355708 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84nxj\" (UniqueName: \"kubernetes.io/projected/c50099a5-67b0-4c0b-be11-146f30190beb-kube-api-access-84nxj\") pod \"c50099a5-67b0-4c0b-be11-146f30190beb\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.355789 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data\") pod \"c50099a5-67b0-4c0b-be11-146f30190beb\" (UID: \"c50099a5-67b0-4c0b-be11-146f30190beb\") " Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.364944 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c50099a5-67b0-4c0b-be11-146f30190beb" (UID: "c50099a5-67b0-4c0b-be11-146f30190beb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.379266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50099a5-67b0-4c0b-be11-146f30190beb-kube-api-access-84nxj" (OuterVolumeSpecName: "kube-api-access-84nxj") pod "c50099a5-67b0-4c0b-be11-146f30190beb" (UID: "c50099a5-67b0-4c0b-be11-146f30190beb"). InnerVolumeSpecName "kube-api-access-84nxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.387376 4735 generic.go:334] "Generic (PLEG): container finished" podID="c50099a5-67b0-4c0b-be11-146f30190beb" containerID="d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23" exitCode=137 Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.387427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86957cdc-x94sr" event={"ID":"c50099a5-67b0-4c0b-be11-146f30190beb","Type":"ContainerDied","Data":"d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23"} Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.387452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86957cdc-x94sr" event={"ID":"c50099a5-67b0-4c0b-be11-146f30190beb","Type":"ContainerDied","Data":"5afb2720042c68df7313b2a94e9681761d4cd7a4a87610bbe852676cf3df3fab"} Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.387473 4735 scope.go:117] "RemoveContainer" containerID="d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.387579 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-86957cdc-x94sr" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.404049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c50099a5-67b0-4c0b-be11-146f30190beb" (UID: "c50099a5-67b0-4c0b-be11-146f30190beb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.435010 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data" (OuterVolumeSpecName: "config-data") pod "c50099a5-67b0-4c0b-be11-146f30190beb" (UID: "c50099a5-67b0-4c0b-be11-146f30190beb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.458554 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.458583 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.458595 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84nxj\" (UniqueName: \"kubernetes.io/projected/c50099a5-67b0-4c0b-be11-146f30190beb-kube-api-access-84nxj\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.458605 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50099a5-67b0-4c0b-be11-146f30190beb-config-data\") on 
node \"crc\" DevicePath \"\"" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.484230 4735 scope.go:117] "RemoveContainer" containerID="d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23" Mar 17 01:31:48 crc kubenswrapper[4735]: E0317 01:31:48.484972 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23\": container with ID starting with d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23 not found: ID does not exist" containerID="d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.485013 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23"} err="failed to get container status \"d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23\": rpc error: code = NotFound desc = could not find container \"d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23\": container with ID starting with d9459d9d1207c362129019d6271468201e636a1a5a7ba682c81bbb38e6d24f23 not found: ID does not exist" Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.723688 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-86957cdc-x94sr"] Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.734368 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-86957cdc-x94sr"] Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.875187 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.876027 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-central-agent" 
containerID="cri-o://438338f4f7328b5d0481aab2c83681736370429fda12290a8a23d5bc829c722d" gracePeriod=30 Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.876151 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="sg-core" containerID="cri-o://565fd3e372474697cdde8bd5afa29c83ac6cc628150d4e789f7df59ac3d9d045" gracePeriod=30 Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.876167 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-notification-agent" containerID="cri-o://969e8e35c4d4d7d0903f9275236535e93cb0682a9c9145fb938f6ddf9fe1d9da" gracePeriod=30 Mar 17 01:31:48 crc kubenswrapper[4735]: I0317 01:31:48.876212 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="proxy-httpd" containerID="cri-o://d6d068d82c4ea529054907db20a118ce79c005a5b5118c44e48be97808a7c2f2" gracePeriod=30 Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.082313 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50099a5-67b0-4c0b-be11-146f30190beb" path="/var/lib/kubelet/pods/c50099a5-67b0-4c0b-be11-146f30190beb/volumes" Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.400238 4735 generic.go:334] "Generic (PLEG): container finished" podID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerID="d6d068d82c4ea529054907db20a118ce79c005a5b5118c44e48be97808a7c2f2" exitCode=0 Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.400273 4735 generic.go:334] "Generic (PLEG): container finished" podID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerID="565fd3e372474697cdde8bd5afa29c83ac6cc628150d4e789f7df59ac3d9d045" exitCode=2 Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.400281 4735 generic.go:334] "Generic 
(PLEG): container finished" podID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerID="969e8e35c4d4d7d0903f9275236535e93cb0682a9c9145fb938f6ddf9fe1d9da" exitCode=0 Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.400283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerDied","Data":"d6d068d82c4ea529054907db20a118ce79c005a5b5118c44e48be97808a7c2f2"} Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.400341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerDied","Data":"565fd3e372474697cdde8bd5afa29c83ac6cc628150d4e789f7df59ac3d9d045"} Mar 17 01:31:49 crc kubenswrapper[4735]: I0317 01:31:49.400354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerDied","Data":"969e8e35c4d4d7d0903f9275236535e93cb0682a9c9145fb938f6ddf9fe1d9da"} Mar 17 01:31:54 crc kubenswrapper[4735]: I0317 01:31:54.452010 4735 generic.go:334] "Generic (PLEG): container finished" podID="6275d08c-4457-4de4-aa66-98f5666568f5" containerID="cf2442c3c95bbc6e66efc775ac7000bd8eb6691858efefbbbbe2e3ab5a20d394" exitCode=0 Mar 17 01:31:54 crc kubenswrapper[4735]: I0317 01:31:54.452097 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" event={"ID":"6275d08c-4457-4de4-aa66-98f5666568f5","Type":"ContainerDied","Data":"cf2442c3c95bbc6e66efc775ac7000bd8eb6691858efefbbbbe2e3ab5a20d394"} Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.472435 4735 generic.go:334] "Generic (PLEG): container finished" podID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerID="438338f4f7328b5d0481aab2c83681736370429fda12290a8a23d5bc829c722d" exitCode=0 Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.473109 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerDied","Data":"438338f4f7328b5d0481aab2c83681736370429fda12290a8a23d5bc829c722d"} Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.724752 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.807647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-combined-ca-bundle\") pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.807957 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-sg-core-conf-yaml\") pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.808024 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-log-httpd\") pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.808090 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-run-httpd\") pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.808114 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-scripts\") 
pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.808217 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-config-data\") pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.808243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lkzj\" (UniqueName: \"kubernetes.io/projected/41e1fcd6-228d-422a-bf3a-ecb839720b4e-kube-api-access-8lkzj\") pod \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\" (UID: \"41e1fcd6-228d-422a-bf3a-ecb839720b4e\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.808772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.809549 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.821610 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-scripts" (OuterVolumeSpecName: "scripts") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.823258 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e1fcd6-228d-422a-bf3a-ecb839720b4e-kube-api-access-8lkzj" (OuterVolumeSpecName: "kube-api-access-8lkzj") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). InnerVolumeSpecName "kube-api-access-8lkzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.824636 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.863850 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.909823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qth7h\" (UniqueName: \"kubernetes.io/projected/6275d08c-4457-4de4-aa66-98f5666568f5-kube-api-access-qth7h\") pod \"6275d08c-4457-4de4-aa66-98f5666568f5\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.909911 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-scripts\") pod \"6275d08c-4457-4de4-aa66-98f5666568f5\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.909946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-combined-ca-bundle\") pod \"6275d08c-4457-4de4-aa66-98f5666568f5\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.909989 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-config-data\") pod \"6275d08c-4457-4de4-aa66-98f5666568f5\" (UID: \"6275d08c-4457-4de4-aa66-98f5666568f5\") " Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.910432 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lkzj\" (UniqueName: \"kubernetes.io/projected/41e1fcd6-228d-422a-bf3a-ecb839720b4e-kube-api-access-8lkzj\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.910450 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.910459 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.910468 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41e1fcd6-228d-422a-bf3a-ecb839720b4e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.910476 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.920292 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-scripts" (OuterVolumeSpecName: "scripts") pod "6275d08c-4457-4de4-aa66-98f5666568f5" (UID: "6275d08c-4457-4de4-aa66-98f5666568f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.922171 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.928087 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6275d08c-4457-4de4-aa66-98f5666568f5-kube-api-access-qth7h" (OuterVolumeSpecName: "kube-api-access-qth7h") pod "6275d08c-4457-4de4-aa66-98f5666568f5" (UID: "6275d08c-4457-4de4-aa66-98f5666568f5"). InnerVolumeSpecName "kube-api-access-qth7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.936247 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6275d08c-4457-4de4-aa66-98f5666568f5" (UID: "6275d08c-4457-4de4-aa66-98f5666568f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.938371 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-config-data" (OuterVolumeSpecName: "config-data") pod "6275d08c-4457-4de4-aa66-98f5666568f5" (UID: "6275d08c-4457-4de4-aa66-98f5666568f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:55 crc kubenswrapper[4735]: I0317 01:31:55.949892 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-config-data" (OuterVolumeSpecName: "config-data") pod "41e1fcd6-228d-422a-bf3a-ecb839720b4e" (UID: "41e1fcd6-228d-422a-bf3a-ecb839720b4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.013036 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.013089 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e1fcd6-228d-422a-bf3a-ecb839720b4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.013115 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qth7h\" (UniqueName: \"kubernetes.io/projected/6275d08c-4457-4de4-aa66-98f5666568f5-kube-api-access-qth7h\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.013134 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.013153 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.013173 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275d08c-4457-4de4-aa66-98f5666568f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.486577 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.486565 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41e1fcd6-228d-422a-bf3a-ecb839720b4e","Type":"ContainerDied","Data":"b086f112b948f0f324e710aabb08d387f3474d7543414a05b836fefbbddba394"} Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.490136 4735 scope.go:117] "RemoveContainer" containerID="d6d068d82c4ea529054907db20a118ce79c005a5b5118c44e48be97808a7c2f2" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.498176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" event={"ID":"6275d08c-4457-4de4-aa66-98f5666568f5","Type":"ContainerDied","Data":"ec19b1ce1e2878d24771eb8eab6461ff1f7a59884d87018d7961a755812b37e3"} Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.498218 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec19b1ce1e2878d24771eb8eab6461ff1f7a59884d87018d7961a755812b37e3" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.498688 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bb4vj" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.522700 4735 scope.go:117] "RemoveContainer" containerID="565fd3e372474697cdde8bd5afa29c83ac6cc628150d4e789f7df59ac3d9d045" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.557470 4735 scope.go:117] "RemoveContainer" containerID="969e8e35c4d4d7d0903f9275236535e93cb0682a9c9145fb938f6ddf9fe1d9da" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.561505 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.571073 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.586973 4735 scope.go:117] "RemoveContainer" containerID="438338f4f7328b5d0481aab2c83681736370429fda12290a8a23d5bc829c722d" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593143 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:56 crc kubenswrapper[4735]: E0317 01:31:56.593527 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="sg-core" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593544 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="sg-core" Mar 17 01:31:56 crc kubenswrapper[4735]: E0317 01:31:56.593566 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6275d08c-4457-4de4-aa66-98f5666568f5" containerName="nova-cell0-conductor-db-sync" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593574 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6275d08c-4457-4de4-aa66-98f5666568f5" containerName="nova-cell0-conductor-db-sync" Mar 17 01:31:56 crc kubenswrapper[4735]: E0317 01:31:56.593590 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="proxy-httpd" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593596 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="proxy-httpd" Mar 17 01:31:56 crc kubenswrapper[4735]: E0317 01:31:56.593607 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-notification-agent" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593612 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-notification-agent" Mar 17 01:31:56 crc kubenswrapper[4735]: E0317 01:31:56.593621 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-central-agent" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593626 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-central-agent" Mar 17 01:31:56 crc kubenswrapper[4735]: E0317 01:31:56.593661 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50099a5-67b0-4c0b-be11-146f30190beb" containerName="heat-api" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593667 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50099a5-67b0-4c0b-be11-146f30190beb" containerName="heat-api" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593824 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6275d08c-4457-4de4-aa66-98f5666568f5" containerName="nova-cell0-conductor-db-sync" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593839 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c50099a5-67b0-4c0b-be11-146f30190beb" containerName="heat-api" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593867 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="proxy-httpd" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593875 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="sg-core" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593882 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-central-agent" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.593889 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" containerName="ceilometer-notification-agent" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.595432 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.602368 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.602576 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.617520 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.671422 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.672463 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.674642 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.675174 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wl4fd" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.704826 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731068 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aaa374c-be86-4c12-81f6-cef430c7a160-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-scripts\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731215 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gng\" (UniqueName: \"kubernetes.io/projected/e245b634-5cfa-4496-8b2e-86264de5854d-kube-api-access-c6gng\") pod 
\"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-log-httpd\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731293 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcbtj\" (UniqueName: \"kubernetes.io/projected/8aaa374c-be86-4c12-81f6-cef430c7a160-kube-api-access-fcbtj\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-config-data\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731362 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-run-httpd\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 
01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.731510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aaa374c-be86-4c12-81f6-cef430c7a160-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832625 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aaa374c-be86-4c12-81f6-cef430c7a160-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-scripts\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832696 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832736 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gng\" (UniqueName: \"kubernetes.io/projected/e245b634-5cfa-4496-8b2e-86264de5854d-kube-api-access-c6gng\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-log-httpd\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832786 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcbtj\" (UniqueName: \"kubernetes.io/projected/8aaa374c-be86-4c12-81f6-cef430c7a160-kube-api-access-fcbtj\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-config-data\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-run-httpd\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.832892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aaa374c-be86-4c12-81f6-cef430c7a160-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " 
pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.833681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-log-httpd\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.834266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-run-httpd\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.838306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-scripts\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.838406 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.839419 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aaa374c-be86-4c12-81f6-cef430c7a160-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.839746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8aaa374c-be86-4c12-81f6-cef430c7a160-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.839749 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.840552 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-config-data\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.849988 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gng\" (UniqueName: \"kubernetes.io/projected/e245b634-5cfa-4496-8b2e-86264de5854d-kube-api-access-c6gng\") pod \"ceilometer-0\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.855461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbtj\" (UniqueName: \"kubernetes.io/projected/8aaa374c-be86-4c12-81f6-cef430c7a160-kube-api-access-fcbtj\") pod \"nova-cell0-conductor-0\" (UID: \"8aaa374c-be86-4c12-81f6-cef430c7a160\") " pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.923417 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:31:56 crc kubenswrapper[4735]: I0317 01:31:56.986144 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:57 crc kubenswrapper[4735]: I0317 01:31:57.092737 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e1fcd6-228d-422a-bf3a-ecb839720b4e" path="/var/lib/kubelet/pods/41e1fcd6-228d-422a-bf3a-ecb839720b4e/volumes" Mar 17 01:31:57 crc kubenswrapper[4735]: I0317 01:31:57.408984 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:31:57 crc kubenswrapper[4735]: I0317 01:31:57.490325 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 01:31:57 crc kubenswrapper[4735]: W0317 01:31:57.498675 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aaa374c_be86_4c12_81f6_cef430c7a160.slice/crio-95c4d282dbb953a4c920058596c91573417f35e0f3c1244dcb95fb3b2ef31681 WatchSource:0}: Error finding container 95c4d282dbb953a4c920058596c91573417f35e0f3c1244dcb95fb3b2ef31681: Status 404 returned error can't find the container with id 95c4d282dbb953a4c920058596c91573417f35e0f3c1244dcb95fb3b2ef31681 Mar 17 01:31:57 crc kubenswrapper[4735]: I0317 01:31:57.511689 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerStarted","Data":"a8417a613047f43836027b3cd572af5f3b8630b1048535b65b17f1278854f6f3"} Mar 17 01:31:58 crc kubenswrapper[4735]: I0317 01:31:58.526476 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerStarted","Data":"91faf749add6a123dabdccc8d3cf17c7b85b229fa2e2ca1be83a410028ef93c7"} Mar 17 01:31:58 crc kubenswrapper[4735]: I0317 01:31:58.526943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerStarted","Data":"cccd3a6760b6702e9d69c6188245bf360640cbe3c4dfc4f68224e1e2841ef165"} Mar 17 01:31:58 crc kubenswrapper[4735]: I0317 01:31:58.529423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8aaa374c-be86-4c12-81f6-cef430c7a160","Type":"ContainerStarted","Data":"897f477deaff0a2297ff31adf51eb6ee7b302a59e622d3b28cd4a04325027faf"} Mar 17 01:31:58 crc kubenswrapper[4735]: I0317 01:31:58.529561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8aaa374c-be86-4c12-81f6-cef430c7a160","Type":"ContainerStarted","Data":"95c4d282dbb953a4c920058596c91573417f35e0f3c1244dcb95fb3b2ef31681"} Mar 17 01:31:58 crc kubenswrapper[4735]: I0317 01:31:58.529722 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 17 01:31:58 crc kubenswrapper[4735]: I0317 01:31:58.556829 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.556802621 podStartE2EDuration="2.556802621s" podCreationTimestamp="2026-03-17 01:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:31:58.553952551 +0000 UTC m=+1344.186185529" watchObservedRunningTime="2026-03-17 01:31:58.556802621 +0000 UTC m=+1344.189035609" Mar 17 01:31:59 crc kubenswrapper[4735]: I0317 01:31:59.540211 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerStarted","Data":"6aafdd5cb311e4f1a4c682844f51021f4ac983a71a9add6b070c6e3b5040923a"} Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.150765 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561852-2pfnc"] Mar 17 01:32:00 crc 
kubenswrapper[4735]: I0317 01:32:00.152092 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.154375 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.155621 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.155999 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.160507 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-2pfnc"] Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.202088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z889f\" (UniqueName: \"kubernetes.io/projected/77522492-88e3-46d7-a525-badec22c6c4b-kube-api-access-z889f\") pod \"auto-csr-approver-29561852-2pfnc\" (UID: \"77522492-88e3-46d7-a525-badec22c6c4b\") " pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.304503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z889f\" (UniqueName: \"kubernetes.io/projected/77522492-88e3-46d7-a525-badec22c6c4b-kube-api-access-z889f\") pod \"auto-csr-approver-29561852-2pfnc\" (UID: \"77522492-88e3-46d7-a525-badec22c6c4b\") " pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.327758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z889f\" (UniqueName: \"kubernetes.io/projected/77522492-88e3-46d7-a525-badec22c6c4b-kube-api-access-z889f\") pod 
\"auto-csr-approver-29561852-2pfnc\" (UID: \"77522492-88e3-46d7-a525-badec22c6c4b\") " pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.467099 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:00 crc kubenswrapper[4735]: I0317 01:32:00.944766 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-2pfnc"] Mar 17 01:32:01 crc kubenswrapper[4735]: I0317 01:32:01.561562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" event={"ID":"77522492-88e3-46d7-a525-badec22c6c4b","Type":"ContainerStarted","Data":"6d002228048c0cc98fab4d915fcbc1466e1b28fec10f7f782594f04e99e12a7c"} Mar 17 01:32:01 crc kubenswrapper[4735]: I0317 01:32:01.564575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerStarted","Data":"a2758603db03a8c385747f6d3a18cc38c2694322e5dd0bffa04f1c7a165bafa3"} Mar 17 01:32:01 crc kubenswrapper[4735]: I0317 01:32:01.565741 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:32:01 crc kubenswrapper[4735]: I0317 01:32:01.595767 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.469657025 podStartE2EDuration="5.595742178s" podCreationTimestamp="2026-03-17 01:31:56 +0000 UTC" firstStartedPulling="2026-03-17 01:31:57.420276555 +0000 UTC m=+1343.052509533" lastFinishedPulling="2026-03-17 01:32:00.546361708 +0000 UTC m=+1346.178594686" observedRunningTime="2026-03-17 01:32:01.58968967 +0000 UTC m=+1347.221922648" watchObservedRunningTime="2026-03-17 01:32:01.595742178 +0000 UTC m=+1347.227975186" Mar 17 01:32:02 crc kubenswrapper[4735]: I0317 01:32:02.578827 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" event={"ID":"77522492-88e3-46d7-a525-badec22c6c4b","Type":"ContainerStarted","Data":"39fb346fce05fccbaaa9eb90bae49c62db4b24855650346e89adf8ea5154ad2b"} Mar 17 01:32:02 crc kubenswrapper[4735]: I0317 01:32:02.597582 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" podStartSLOduration=1.631141785 podStartE2EDuration="2.597562335s" podCreationTimestamp="2026-03-17 01:32:00 +0000 UTC" firstStartedPulling="2026-03-17 01:32:00.951971111 +0000 UTC m=+1346.584204089" lastFinishedPulling="2026-03-17 01:32:01.918391661 +0000 UTC m=+1347.550624639" observedRunningTime="2026-03-17 01:32:02.592092091 +0000 UTC m=+1348.224325069" watchObservedRunningTime="2026-03-17 01:32:02.597562335 +0000 UTC m=+1348.229795323" Mar 17 01:32:03 crc kubenswrapper[4735]: I0317 01:32:03.595657 4735 generic.go:334] "Generic (PLEG): container finished" podID="77522492-88e3-46d7-a525-badec22c6c4b" containerID="39fb346fce05fccbaaa9eb90bae49c62db4b24855650346e89adf8ea5154ad2b" exitCode=0 Mar 17 01:32:03 crc kubenswrapper[4735]: I0317 01:32:03.597132 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" event={"ID":"77522492-88e3-46d7-a525-badec22c6c4b","Type":"ContainerDied","Data":"39fb346fce05fccbaaa9eb90bae49c62db4b24855650346e89adf8ea5154ad2b"} Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.085641 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.227853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z889f\" (UniqueName: \"kubernetes.io/projected/77522492-88e3-46d7-a525-badec22c6c4b-kube-api-access-z889f\") pod \"77522492-88e3-46d7-a525-badec22c6c4b\" (UID: \"77522492-88e3-46d7-a525-badec22c6c4b\") " Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.242531 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77522492-88e3-46d7-a525-badec22c6c4b-kube-api-access-z889f" (OuterVolumeSpecName: "kube-api-access-z889f") pod "77522492-88e3-46d7-a525-badec22c6c4b" (UID: "77522492-88e3-46d7-a525-badec22c6c4b"). InnerVolumeSpecName "kube-api-access-z889f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.330609 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z889f\" (UniqueName: \"kubernetes.io/projected/77522492-88e3-46d7-a525-badec22c6c4b-kube-api-access-z889f\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.626063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" event={"ID":"77522492-88e3-46d7-a525-badec22c6c4b","Type":"ContainerDied","Data":"6d002228048c0cc98fab4d915fcbc1466e1b28fec10f7f782594f04e99e12a7c"} Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.626402 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d002228048c0cc98fab4d915fcbc1466e1b28fec10f7f782594f04e99e12a7c" Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.626133 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-2pfnc" Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.704158 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-4jp52"] Mar 17 01:32:05 crc kubenswrapper[4735]: I0317 01:32:05.716231 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-4jp52"] Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.036741 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.095582 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890c33ed-65c3-4f76-9da9-99b3f3e8ef33" path="/var/lib/kubelet/pods/890c33ed-65c3-4f76-9da9-99b3f3e8ef33/volumes" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.588075 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-q4lh2"] Mar 17 01:32:07 crc kubenswrapper[4735]: E0317 01:32:07.588974 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77522492-88e3-46d7-a525-badec22c6c4b" containerName="oc" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.589131 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="77522492-88e3-46d7-a525-badec22c6c4b" containerName="oc" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.589533 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="77522492-88e3-46d7-a525-badec22c6c4b" containerName="oc" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.590604 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.599200 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.607802 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-q4lh2"] Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.611490 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.679244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-config-data\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.679296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.679334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-scripts\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.679429 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4cqv\" (UniqueName: 
\"kubernetes.io/projected/9384926e-1432-4a7d-9a9a-10b05219cc9e-kube-api-access-h4cqv\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.778082 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.779682 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.781348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4cqv\" (UniqueName: \"kubernetes.io/projected/9384926e-1432-4a7d-9a9a-10b05219cc9e-kube-api-access-h4cqv\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.781432 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-config-data\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.781462 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.781485 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-scripts\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: 
\"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.793937 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-scripts\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.801021 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.812676 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.816475 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.838442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-config-data\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.869566 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4cqv\" (UniqueName: \"kubernetes.io/projected/9384926e-1432-4a7d-9a9a-10b05219cc9e-kube-api-access-h4cqv\") pod \"nova-cell0-cell-mapping-q4lh2\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.884327 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/505a18f2-c4f5-4d41-bff0-50b7fb538824-logs\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.884388 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-config-data\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.884418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.884443 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p8pw\" (UniqueName: \"kubernetes.io/projected/505a18f2-c4f5-4d41-bff0-50b7fb538824-kube-api-access-9p8pw\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.906259 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.909494 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.910594 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.915578 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.983105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.986903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/505a18f2-c4f5-4d41-bff0-50b7fb538824-logs\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.986954 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-config-data\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.986994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-config-data\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.987020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.987046 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p8pw\" (UniqueName: 
\"kubernetes.io/projected/505a18f2-c4f5-4d41-bff0-50b7fb538824-kube-api-access-9p8pw\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.987104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5zm\" (UniqueName: \"kubernetes.io/projected/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-kube-api-access-7t5zm\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.987139 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:07 crc kubenswrapper[4735]: I0317 01:32:07.987324 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/505a18f2-c4f5-4d41-bff0-50b7fb538824-logs\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.021036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.030552 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p8pw\" (UniqueName: \"kubernetes.io/projected/505a18f2-c4f5-4d41-bff0-50b7fb538824-kube-api-access-9p8pw\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " 
pod="openstack/nova-api-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.030762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-config-data\") pod \"nova-api-0\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") " pod="openstack/nova-api-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.091985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-config-data\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.092149 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5zm\" (UniqueName: \"kubernetes.io/projected/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-kube-api-access-7t5zm\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.092229 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.101627 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.102152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-config-data\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.127972 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.130193 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.135054 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.151411 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5zm\" (UniqueName: \"kubernetes.io/projected/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-kube-api-access-7t5zm\") pod \"nova-scheduler-0\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.194587 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.194719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-logs\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.194756 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmgl\" (UniqueName: 
\"kubernetes.io/projected/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-kube-api-access-9wmgl\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.194800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-config-data\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.232009 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.261583 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.284179 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.347701 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.366614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.366782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-logs\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.366880 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmgl\" (UniqueName: \"kubernetes.io/projected/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-kube-api-access-9wmgl\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.366984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-config-data\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.373969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-logs\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.380964 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.391277 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-config-data\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.400413 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.412791 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.415304 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmgl\" (UniqueName: \"kubernetes.io/projected/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-kube-api-access-9wmgl\") pod \"nova-metadata-0\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.415427 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.455248 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-546765fdff-hb4kb"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.456735 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.469461 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phhjq\" (UniqueName: \"kubernetes.io/projected/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-kube-api-access-phhjq\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.469513 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.469569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.474183 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.491871 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546765fdff-hb4kb"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-nb\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-svc\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phhjq\" (UniqueName: \"kubernetes.io/projected/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-kube-api-access-phhjq\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fgjm\" (UniqueName: \"kubernetes.io/projected/c654c0cc-f663-4a38-9872-f4c10a4ee171-kube-api-access-9fgjm\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-config\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-sb\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.573467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-swift-storage-0\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.579424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.583706 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.597125 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phhjq\" (UniqueName: \"kubernetes.io/projected/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-kube-api-access-phhjq\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.675011 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-svc\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.675140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fgjm\" (UniqueName: \"kubernetes.io/projected/c654c0cc-f663-4a38-9872-f4c10a4ee171-kube-api-access-9fgjm\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.675162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-config\") pod 
\"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.675200 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-sb\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.675303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-swift-storage-0\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.675331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-nb\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.676418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-svc\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.676823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-nb\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: 
\"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.676956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-config\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.677792 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-sb\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.678242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-swift-storage-0\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.694657 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fgjm\" (UniqueName: \"kubernetes.io/projected/c654c0cc-f663-4a38-9872-f4c10a4ee171-kube-api-access-9fgjm\") pod \"dnsmasq-dns-546765fdff-hb4kb\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.734064 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.753458 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-q4lh2"] Mar 17 01:32:08 crc kubenswrapper[4735]: I0317 01:32:08.795000 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.034427 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.048591 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.260607 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.586164 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.702340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9","Type":"ContainerStarted","Data":"28884e83c2544eb7020785e4bdd1f7a668f29c142739ca0bb9ed41e9b04ce6b2"} Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.707550 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546765fdff-hb4kb"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.713710 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q4lh2" event={"ID":"9384926e-1432-4a7d-9a9a-10b05219cc9e","Type":"ContainerStarted","Data":"45d87ecc4d8420b78d004cbf9ec049f64406af1fa1707f26bf2eb1c1d1ed3cac"} Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.713751 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q4lh2" 
event={"ID":"9384926e-1432-4a7d-9a9a-10b05219cc9e","Type":"ContainerStarted","Data":"a2e6e7570d7576fa520e1aac0728a678ce7a64ae4326f3aca5c6a3d01d973d73"} Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.732118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a","Type":"ContainerStarted","Data":"b1ac5f3167d308d4a55734ff6aa38073fa7493acee4049c3209c48f31fcb0ce0"} Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.734022 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j9k5n"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.735507 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.740342 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.740540 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.743655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b","Type":"ContainerStarted","Data":"e792df8ab1bc0cfc821088c0ca7cd148c9603b5462aa9c18b7b7527b1a6eea54"} Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.749701 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"505a18f2-c4f5-4d41-bff0-50b7fb538824","Type":"ContainerStarted","Data":"26e003fb263eedfadd12bd2c5425f5aee47a5c9f5bb195c55e79ac71d3707212"} Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.752780 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j9k5n"] Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 
01:32:09.761457 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-q4lh2" podStartSLOduration=2.761435758 podStartE2EDuration="2.761435758s" podCreationTimestamp="2026-03-17 01:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:09.748251485 +0000 UTC m=+1355.380484453" watchObservedRunningTime="2026-03-17 01:32:09.761435758 +0000 UTC m=+1355.393668736" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.925550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8zb\" (UniqueName: \"kubernetes.io/projected/05381036-1334-4a2b-a3ce-c64331ba0ebb-kube-api-access-pq8zb\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.925608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-config-data\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.925644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-scripts\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:09 crc kubenswrapper[4735]: I0317 01:32:09.925815 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.030782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-scripts\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.030889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.030995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8zb\" (UniqueName: \"kubernetes.io/projected/05381036-1334-4a2b-a3ce-c64331ba0ebb-kube-api-access-pq8zb\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.031034 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-config-data\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.046330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-scripts\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.055460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-config-data\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.066565 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.081480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq8zb\" (UniqueName: \"kubernetes.io/projected/05381036-1334-4a2b-a3ce-c64331ba0ebb-kube-api-access-pq8zb\") pod \"nova-cell1-conductor-db-sync-j9k5n\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.361064 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.792202 4735 generic.go:334] "Generic (PLEG): container finished" podID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerID="7a1a644397a3c4f906dec8527d161cc291be57a07d0bee4477dbceecc56fcccf" exitCode=0 Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.793011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" event={"ID":"c654c0cc-f663-4a38-9872-f4c10a4ee171","Type":"ContainerDied","Data":"7a1a644397a3c4f906dec8527d161cc291be57a07d0bee4477dbceecc56fcccf"} Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.793062 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" event={"ID":"c654c0cc-f663-4a38-9872-f4c10a4ee171","Type":"ContainerStarted","Data":"fe299b728aa4a8c4222a53deeeb8a5a954dfec2e516d3a83ff0335bd93b0d4ed"} Mar 17 01:32:10 crc kubenswrapper[4735]: I0317 01:32:10.972092 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j9k5n"] Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.803371 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" event={"ID":"05381036-1334-4a2b-a3ce-c64331ba0ebb","Type":"ContainerStarted","Data":"5292ec403c041d4f4968f78f419f67d592c822306f045f6a00310150735ab213"} Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.803673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" event={"ID":"05381036-1334-4a2b-a3ce-c64331ba0ebb","Type":"ContainerStarted","Data":"f245d92308b03fa9b82dcbf53ceb7c38ebd9b22b4d92eadcc8b01b2648e713e9"} Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.812469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" 
event={"ID":"c654c0cc-f663-4a38-9872-f4c10a4ee171","Type":"ContainerStarted","Data":"0f90e5ff83cd5a8d9ab5b61040decb7d59fdead7810def6cd46c32936d87c062"} Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.812609 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.827354 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" podStartSLOduration=2.827340725 podStartE2EDuration="2.827340725s" podCreationTimestamp="2026-03-17 01:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:11.824426214 +0000 UTC m=+1357.456659192" watchObservedRunningTime="2026-03-17 01:32:11.827340725 +0000 UTC m=+1357.459573703" Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.870603 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" podStartSLOduration=3.870581104 podStartE2EDuration="3.870581104s" podCreationTimestamp="2026-03-17 01:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:11.864564356 +0000 UTC m=+1357.496797324" watchObservedRunningTime="2026-03-17 01:32:11.870581104 +0000 UTC m=+1357.502814082" Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.890889 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:11 crc kubenswrapper[4735]: I0317 01:32:11.955938 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:12 crc kubenswrapper[4735]: I0317 01:32:12.606480 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:32:12 crc kubenswrapper[4735]: I0317 01:32:12.606524 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.838478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a","Type":"ContainerStarted","Data":"1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16"} Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.839079 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16" gracePeriod=30 Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.845664 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b","Type":"ContainerStarted","Data":"53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e"} Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.849722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"505a18f2-c4f5-4d41-bff0-50b7fb538824","Type":"ContainerStarted","Data":"dc48295c9226f3dbd7a5193f4fa2861088e17241c00116c75c9cfa76a649d6e2"} Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.849767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"505a18f2-c4f5-4d41-bff0-50b7fb538824","Type":"ContainerStarted","Data":"477517ba213534ea34f43e871daa58088be01ac7fcdf4ecfc028909ccbdbd882"} Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.851536 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9","Type":"ContainerStarted","Data":"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380"} Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.851560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9","Type":"ContainerStarted","Data":"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b"} Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.851684 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-log" containerID="cri-o://fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b" gracePeriod=30 Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.852003 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-metadata" containerID="cri-o://55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380" gracePeriod=30 Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.892133 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.15144789 podStartE2EDuration="7.892117016s" podCreationTimestamp="2026-03-17 01:32:07 +0000 UTC" firstStartedPulling="2026-03-17 01:32:09.082414407 +0000 UTC m=+1354.714647385" lastFinishedPulling="2026-03-17 01:32:13.823083533 +0000 UTC m=+1359.455316511" observedRunningTime="2026-03-17 01:32:14.889485011 +0000 UTC m=+1360.521717989" 
watchObservedRunningTime="2026-03-17 01:32:14.892117016 +0000 UTC m=+1360.524349994" Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.895636 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.694401859 podStartE2EDuration="6.895629062s" podCreationTimestamp="2026-03-17 01:32:08 +0000 UTC" firstStartedPulling="2026-03-17 01:32:09.620762383 +0000 UTC m=+1355.252995361" lastFinishedPulling="2026-03-17 01:32:13.821989586 +0000 UTC m=+1359.454222564" observedRunningTime="2026-03-17 01:32:14.86698023 +0000 UTC m=+1360.499213198" watchObservedRunningTime="2026-03-17 01:32:14.895629062 +0000 UTC m=+1360.527862040" Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.939563 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.210897246 podStartE2EDuration="7.939541578s" podCreationTimestamp="2026-03-17 01:32:07 +0000 UTC" firstStartedPulling="2026-03-17 01:32:09.09519292 +0000 UTC m=+1354.727425898" lastFinishedPulling="2026-03-17 01:32:13.823837252 +0000 UTC m=+1359.456070230" observedRunningTime="2026-03-17 01:32:14.915476147 +0000 UTC m=+1360.547709125" watchObservedRunningTime="2026-03-17 01:32:14.939541578 +0000 UTC m=+1360.571774556" Mar 17 01:32:14 crc kubenswrapper[4735]: I0317 01:32:14.951214 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.392832972 podStartE2EDuration="7.949964033s" podCreationTimestamp="2026-03-17 01:32:07 +0000 UTC" firstStartedPulling="2026-03-17 01:32:09.276925911 +0000 UTC m=+1354.909158889" lastFinishedPulling="2026-03-17 01:32:13.834056972 +0000 UTC m=+1359.466289950" observedRunningTime="2026-03-17 01:32:14.937930107 +0000 UTC m=+1360.570163085" watchObservedRunningTime="2026-03-17 01:32:14.949964033 +0000 UTC m=+1360.582197011" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.741443 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.865201 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-logs\") pod \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.865293 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-combined-ca-bundle\") pod \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.865509 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-config-data\") pod \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.865546 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wmgl\" (UniqueName: \"kubernetes.io/projected/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-kube-api-access-9wmgl\") pod \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\" (UID: \"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9\") " Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.865701 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-logs" (OuterVolumeSpecName: "logs") pod "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" (UID: "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.866088 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.868896 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerID="55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380" exitCode=0 Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.869003 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerID="fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b" exitCode=143 Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.869439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9","Type":"ContainerDied","Data":"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380"} Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.869517 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.869539 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9","Type":"ContainerDied","Data":"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b"} Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.869554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c67eb74-3c97-4f05-b450-f1a9a25f2bf9","Type":"ContainerDied","Data":"28884e83c2544eb7020785e4bdd1f7a668f29c142739ca0bb9ed41e9b04ce6b2"} Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.869576 4735 scope.go:117] "RemoveContainer" containerID="55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.896192 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-kube-api-access-9wmgl" (OuterVolumeSpecName: "kube-api-access-9wmgl") pod "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" (UID: "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9"). InnerVolumeSpecName "kube-api-access-9wmgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.909488 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" (UID: "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.911377 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-config-data" (OuterVolumeSpecName: "config-data") pod "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" (UID: "0c67eb74-3c97-4f05-b450-f1a9a25f2bf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.934132 4735 scope.go:117] "RemoveContainer" containerID="fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.968823 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.968851 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wmgl\" (UniqueName: \"kubernetes.io/projected/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-kube-api-access-9wmgl\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.968880 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.996038 4735 scope.go:117] "RemoveContainer" containerID="55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380" Mar 17 01:32:15 crc kubenswrapper[4735]: E0317 01:32:15.996566 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380\": container with ID starting with 
55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380 not found: ID does not exist" containerID="55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.996595 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380"} err="failed to get container status \"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380\": rpc error: code = NotFound desc = could not find container \"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380\": container with ID starting with 55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380 not found: ID does not exist" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.996615 4735 scope.go:117] "RemoveContainer" containerID="fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b" Mar 17 01:32:15 crc kubenswrapper[4735]: E0317 01:32:15.997007 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b\": container with ID starting with fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b not found: ID does not exist" containerID="fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.997031 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b"} err="failed to get container status \"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b\": rpc error: code = NotFound desc = could not find container \"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b\": container with ID starting with fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b not found: ID does not 
exist" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.997045 4735 scope.go:117] "RemoveContainer" containerID="55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.997328 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380"} err="failed to get container status \"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380\": rpc error: code = NotFound desc = could not find container \"55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380\": container with ID starting with 55438e192e41388c6cd2d37881805c816899c250d1077efbd9af80faa512f380 not found: ID does not exist" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.997350 4735 scope.go:117] "RemoveContainer" containerID="fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b" Mar 17 01:32:15 crc kubenswrapper[4735]: I0317 01:32:15.997661 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b"} err="failed to get container status \"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b\": rpc error: code = NotFound desc = could not find container \"fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b\": container with ID starting with fc8b601f62064af772d2a38756d9fff49d89fc0ef31d0dd562d2e5acbdd9a17b not found: ID does not exist" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.212010 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.220312 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.236057 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 17 01:32:16 crc kubenswrapper[4735]: E0317 01:32:16.236475 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-log" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.236492 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-log" Mar 17 01:32:16 crc kubenswrapper[4735]: E0317 01:32:16.236526 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-metadata" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.236533 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-metadata" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.236694 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-metadata" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.236724 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" containerName="nova-metadata-log" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.237805 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.242460 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.251144 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.279165 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-logs\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.279216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-config-data\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.279238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzsmq\" (UniqueName: \"kubernetes.io/projected/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-kube-api-access-mzsmq\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.279287 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.279307 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.294284 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.381264 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-config-data\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.381306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzsmq\" (UniqueName: \"kubernetes.io/projected/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-kube-api-access-mzsmq\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.381364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.381390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: 
I0317 01:32:16.381483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-logs\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.381884 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-logs\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.385889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.386452 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-config-data\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.386781 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.399492 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzsmq\" (UniqueName: \"kubernetes.io/projected/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-kube-api-access-mzsmq\") pod \"nova-metadata-0\" 
(UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " pod="openstack/nova-metadata-0" Mar 17 01:32:16 crc kubenswrapper[4735]: I0317 01:32:16.591262 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:17 crc kubenswrapper[4735]: I0317 01:32:17.052662 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:17 crc kubenswrapper[4735]: I0317 01:32:17.089533 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c67eb74-3c97-4f05-b450-f1a9a25f2bf9" path="/var/lib/kubelet/pods/0c67eb74-3c97-4f05-b450-f1a9a25f2bf9/volumes" Mar 17 01:32:17 crc kubenswrapper[4735]: I0317 01:32:17.908675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3cb1df1b-1b15-4cbf-a6f7-a834869872cf","Type":"ContainerStarted","Data":"66dc2769d83a4bd530751f01287778e003b84d3375f2e57f1bcf1c42ba0a44bd"} Mar 17 01:32:17 crc kubenswrapper[4735]: I0317 01:32:17.909410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3cb1df1b-1b15-4cbf-a6f7-a834869872cf","Type":"ContainerStarted","Data":"5456e77db7bf188423fcf0809e32e80d6204dba06a5e3755056552de64782144"} Mar 17 01:32:17 crc kubenswrapper[4735]: I0317 01:32:17.909428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3cb1df1b-1b15-4cbf-a6f7-a834869872cf","Type":"ContainerStarted","Data":"52a098e48957189329fcd57ee5aed66e5f87dde26a6be370c463174650e24c7b"} Mar 17 01:32:17 crc kubenswrapper[4735]: I0317 01:32:17.949574 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9495559770000002 podStartE2EDuration="1.949555977s" podCreationTimestamp="2026-03-17 01:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 
01:32:17.948704536 +0000 UTC m=+1363.580937514" watchObservedRunningTime="2026-03-17 01:32:17.949555977 +0000 UTC m=+1363.581788965" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.262888 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.262959 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.284725 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.284840 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.320387 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 01:32:18 crc kubenswrapper[4735]: E0317 01:32:18.564363 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9384926e_1432_4a7d_9a9a_10b05219cc9e.slice/crio-conmon-45d87ecc4d8420b78d004cbf9ec049f64406af1fa1707f26bf2eb1c1d1ed3cac.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.735397 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.796034 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.888422 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c7b9b57c-2t9zn"] Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.888647 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerName="dnsmasq-dns" containerID="cri-o://512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de" gracePeriod=10 Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.923903 4735 generic.go:334] "Generic (PLEG): container finished" podID="9384926e-1432-4a7d-9a9a-10b05219cc9e" containerID="45d87ecc4d8420b78d004cbf9ec049f64406af1fa1707f26bf2eb1c1d1ed3cac" exitCode=0 Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.924788 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q4lh2" event={"ID":"9384926e-1432-4a7d-9a9a-10b05219cc9e","Type":"ContainerDied","Data":"45d87ecc4d8420b78d004cbf9ec049f64406af1fa1707f26bf2eb1c1d1ed3cac"} Mar 17 01:32:18 crc kubenswrapper[4735]: I0317 01:32:18.964835 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.347029 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.347053 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.427010 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.561735 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-sb\") pod \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.561810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-nb\") pod \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.561922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-swift-storage-0\") pod \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.561990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-config\") pod \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.562072 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-svc\") pod \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.562090 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glbz7\" 
(UniqueName: \"kubernetes.io/projected/15817936-1648-4ce6-bfdd-c1cea98fe7e9-kube-api-access-glbz7\") pod \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\" (UID: \"15817936-1648-4ce6-bfdd-c1cea98fe7e9\") " Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.598075 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15817936-1648-4ce6-bfdd-c1cea98fe7e9-kube-api-access-glbz7" (OuterVolumeSpecName: "kube-api-access-glbz7") pod "15817936-1648-4ce6-bfdd-c1cea98fe7e9" (UID: "15817936-1648-4ce6-bfdd-c1cea98fe7e9"). InnerVolumeSpecName "kube-api-access-glbz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.633033 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15817936-1648-4ce6-bfdd-c1cea98fe7e9" (UID: "15817936-1648-4ce6-bfdd-c1cea98fe7e9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.638367 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-config" (OuterVolumeSpecName: "config") pod "15817936-1648-4ce6-bfdd-c1cea98fe7e9" (UID: "15817936-1648-4ce6-bfdd-c1cea98fe7e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.648949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15817936-1648-4ce6-bfdd-c1cea98fe7e9" (UID: "15817936-1648-4ce6-bfdd-c1cea98fe7e9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.653019 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15817936-1648-4ce6-bfdd-c1cea98fe7e9" (UID: "15817936-1648-4ce6-bfdd-c1cea98fe7e9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.664467 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.664674 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.664770 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glbz7\" (UniqueName: \"kubernetes.io/projected/15817936-1648-4ce6-bfdd-c1cea98fe7e9-kube-api-access-glbz7\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.664902 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.664988 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.682552 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "15817936-1648-4ce6-bfdd-c1cea98fe7e9" (UID: "15817936-1648-4ce6-bfdd-c1cea98fe7e9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.767113 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15817936-1648-4ce6-bfdd-c1cea98fe7e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.933646 4735 generic.go:334] "Generic (PLEG): container finished" podID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerID="512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de" exitCode=0 Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.933773 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.934199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" event={"ID":"15817936-1648-4ce6-bfdd-c1cea98fe7e9","Type":"ContainerDied","Data":"512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de"} Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.935557 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c7b9b57c-2t9zn" event={"ID":"15817936-1648-4ce6-bfdd-c1cea98fe7e9","Type":"ContainerDied","Data":"2fca2a2613f73abd56f97689015c3701ecb734859212f1d0dba3f16e1e6ff7da"} Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.935652 4735 scope.go:117] "RemoveContainer" containerID="512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.957788 4735 scope.go:117] "RemoveContainer" 
containerID="a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c" Mar 17 01:32:19 crc kubenswrapper[4735]: I0317 01:32:19.997434 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c7b9b57c-2t9zn"] Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.019445 4735 scope.go:117] "RemoveContainer" containerID="512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.037257 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c7b9b57c-2t9zn"] Mar 17 01:32:20 crc kubenswrapper[4735]: E0317 01:32:20.059694 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de\": container with ID starting with 512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de not found: ID does not exist" containerID="512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.059735 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de"} err="failed to get container status \"512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de\": rpc error: code = NotFound desc = could not find container \"512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de\": container with ID starting with 512caac1bd3a2668ea8c3aefbb869cca25c6af84949603f06fe9953de19641de not found: ID does not exist" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.059773 4735 scope.go:117] "RemoveContainer" containerID="a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c" Mar 17 01:32:20 crc kubenswrapper[4735]: E0317 01:32:20.060275 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c\": container with ID starting with a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c not found: ID does not exist" containerID="a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.060297 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c"} err="failed to get container status \"a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c\": rpc error: code = NotFound desc = could not find container \"a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c\": container with ID starting with a863f4a51d4f43689deed32b1e452a94f5e7092bac951004b605eb2f411a396c not found: ID does not exist" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.334697 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.381825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-config-data\") pod \"9384926e-1432-4a7d-9a9a-10b05219cc9e\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.381975 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-scripts\") pod \"9384926e-1432-4a7d-9a9a-10b05219cc9e\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.382012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-combined-ca-bundle\") pod \"9384926e-1432-4a7d-9a9a-10b05219cc9e\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.382158 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4cqv\" (UniqueName: \"kubernetes.io/projected/9384926e-1432-4a7d-9a9a-10b05219cc9e-kube-api-access-h4cqv\") pod \"9384926e-1432-4a7d-9a9a-10b05219cc9e\" (UID: \"9384926e-1432-4a7d-9a9a-10b05219cc9e\") " Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.403234 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-scripts" (OuterVolumeSpecName: "scripts") pod "9384926e-1432-4a7d-9a9a-10b05219cc9e" (UID: "9384926e-1432-4a7d-9a9a-10b05219cc9e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.404062 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9384926e-1432-4a7d-9a9a-10b05219cc9e-kube-api-access-h4cqv" (OuterVolumeSpecName: "kube-api-access-h4cqv") pod "9384926e-1432-4a7d-9a9a-10b05219cc9e" (UID: "9384926e-1432-4a7d-9a9a-10b05219cc9e"). InnerVolumeSpecName "kube-api-access-h4cqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.427009 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-config-data" (OuterVolumeSpecName: "config-data") pod "9384926e-1432-4a7d-9a9a-10b05219cc9e" (UID: "9384926e-1432-4a7d-9a9a-10b05219cc9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.460930 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9384926e-1432-4a7d-9a9a-10b05219cc9e" (UID: "9384926e-1432-4a7d-9a9a-10b05219cc9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.484353 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4cqv\" (UniqueName: \"kubernetes.io/projected/9384926e-1432-4a7d-9a9a-10b05219cc9e-kube-api-access-h4cqv\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.484389 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.484410 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.484421 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9384926e-1432-4a7d-9a9a-10b05219cc9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.943296 4735 generic.go:334] "Generic (PLEG): container finished" podID="05381036-1334-4a2b-a3ce-c64331ba0ebb" containerID="5292ec403c041d4f4968f78f419f67d592c822306f045f6a00310150735ab213" exitCode=0 Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.943359 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" event={"ID":"05381036-1334-4a2b-a3ce-c64331ba0ebb","Type":"ContainerDied","Data":"5292ec403c041d4f4968f78f419f67d592c822306f045f6a00310150735ab213"} Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.949081 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q4lh2" 
event={"ID":"9384926e-1432-4a7d-9a9a-10b05219cc9e","Type":"ContainerDied","Data":"a2e6e7570d7576fa520e1aac0728a678ce7a64ae4326f3aca5c6a3d01d973d73"} Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.949122 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2e6e7570d7576fa520e1aac0728a678ce7a64ae4326f3aca5c6a3d01d973d73" Mar 17 01:32:20 crc kubenswrapper[4735]: I0317 01:32:20.949175 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q4lh2" Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.082602 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" path="/var/lib/kubelet/pods/15817936-1648-4ce6-bfdd-c1cea98fe7e9/volumes" Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.140765 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.141169 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-log" containerID="cri-o://477517ba213534ea34f43e871daa58088be01ac7fcdf4ecfc028909ccbdbd882" gracePeriod=30 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.141223 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-api" containerID="cri-o://dc48295c9226f3dbd7a5193f4fa2861088e17241c00116c75c9cfa76a649d6e2" gracePeriod=30 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.164655 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.165138 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" containerName="nova-scheduler-scheduler" containerID="cri-o://53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e" gracePeriod=30 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.199416 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.199645 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-log" containerID="cri-o://5456e77db7bf188423fcf0809e32e80d6204dba06a5e3755056552de64782144" gracePeriod=30 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.199929 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-metadata" containerID="cri-o://66dc2769d83a4bd530751f01287778e003b84d3375f2e57f1bcf1c42ba0a44bd" gracePeriod=30 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.979414 4735 generic.go:334] "Generic (PLEG): container finished" podID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerID="477517ba213534ea34f43e871daa58088be01ac7fcdf4ecfc028909ccbdbd882" exitCode=143 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.979476 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"505a18f2-c4f5-4d41-bff0-50b7fb538824","Type":"ContainerDied","Data":"477517ba213534ea34f43e871daa58088be01ac7fcdf4ecfc028909ccbdbd882"} Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.996103 4735 generic.go:334] "Generic (PLEG): container finished" podID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerID="66dc2769d83a4bd530751f01287778e003b84d3375f2e57f1bcf1c42ba0a44bd" exitCode=0 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.996130 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerID="5456e77db7bf188423fcf0809e32e80d6204dba06a5e3755056552de64782144" exitCode=143 Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.996271 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3cb1df1b-1b15-4cbf-a6f7-a834869872cf","Type":"ContainerDied","Data":"66dc2769d83a4bd530751f01287778e003b84d3375f2e57f1bcf1c42ba0a44bd"} Mar 17 01:32:21 crc kubenswrapper[4735]: I0317 01:32:21.996295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3cb1df1b-1b15-4cbf-a6f7-a834869872cf","Type":"ContainerDied","Data":"5456e77db7bf188423fcf0809e32e80d6204dba06a5e3755056552de64782144"} Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.261133 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.327553 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-combined-ca-bundle\") pod \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.327602 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-config-data\") pod \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.327723 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-logs\") pod \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 
01:32:22.327782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzsmq\" (UniqueName: \"kubernetes.io/projected/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-kube-api-access-mzsmq\") pod \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.327823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-nova-metadata-tls-certs\") pod \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\" (UID: \"3cb1df1b-1b15-4cbf-a6f7-a834869872cf\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.333341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-logs" (OuterVolumeSpecName: "logs") pod "3cb1df1b-1b15-4cbf-a6f7-a834869872cf" (UID: "3cb1df1b-1b15-4cbf-a6f7-a834869872cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.337885 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-kube-api-access-mzsmq" (OuterVolumeSpecName: "kube-api-access-mzsmq") pod "3cb1df1b-1b15-4cbf-a6f7-a834869872cf" (UID: "3cb1df1b-1b15-4cbf-a6f7-a834869872cf"). InnerVolumeSpecName "kube-api-access-mzsmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.380428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-config-data" (OuterVolumeSpecName: "config-data") pod "3cb1df1b-1b15-4cbf-a6f7-a834869872cf" (UID: "3cb1df1b-1b15-4cbf-a6f7-a834869872cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.407347 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cb1df1b-1b15-4cbf-a6f7-a834869872cf" (UID: "3cb1df1b-1b15-4cbf-a6f7-a834869872cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.422390 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3cb1df1b-1b15-4cbf-a6f7-a834869872cf" (UID: "3cb1df1b-1b15-4cbf-a6f7-a834869872cf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.430294 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.430426 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzsmq\" (UniqueName: \"kubernetes.io/projected/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-kube-api-access-mzsmq\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.430594 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.430683 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.430764 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb1df1b-1b15-4cbf-a6f7-a834869872cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.532176 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.636652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-scripts\") pod \"05381036-1334-4a2b-a3ce-c64331ba0ebb\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.637013 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-config-data\") pod \"05381036-1334-4a2b-a3ce-c64331ba0ebb\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.637070 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-combined-ca-bundle\") pod \"05381036-1334-4a2b-a3ce-c64331ba0ebb\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.637439 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq8zb\" (UniqueName: \"kubernetes.io/projected/05381036-1334-4a2b-a3ce-c64331ba0ebb-kube-api-access-pq8zb\") pod \"05381036-1334-4a2b-a3ce-c64331ba0ebb\" (UID: \"05381036-1334-4a2b-a3ce-c64331ba0ebb\") " Mar 17 01:32:22 crc 
kubenswrapper[4735]: I0317 01:32:22.645341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05381036-1334-4a2b-a3ce-c64331ba0ebb-kube-api-access-pq8zb" (OuterVolumeSpecName: "kube-api-access-pq8zb") pod "05381036-1334-4a2b-a3ce-c64331ba0ebb" (UID: "05381036-1334-4a2b-a3ce-c64331ba0ebb"). InnerVolumeSpecName "kube-api-access-pq8zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.645875 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-scripts" (OuterVolumeSpecName: "scripts") pod "05381036-1334-4a2b-a3ce-c64331ba0ebb" (UID: "05381036-1334-4a2b-a3ce-c64331ba0ebb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.666879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05381036-1334-4a2b-a3ce-c64331ba0ebb" (UID: "05381036-1334-4a2b-a3ce-c64331ba0ebb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.676515 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-config-data" (OuterVolumeSpecName: "config-data") pod "05381036-1334-4a2b-a3ce-c64331ba0ebb" (UID: "05381036-1334-4a2b-a3ce-c64331ba0ebb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.739615 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.739651 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.739664 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05381036-1334-4a2b-a3ce-c64331ba0ebb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.739676 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq8zb\" (UniqueName: \"kubernetes.io/projected/05381036-1334-4a2b-a3ce-c64331ba0ebb-kube-api-access-pq8zb\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.758154 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.840449 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-config-data\") pod \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.840540 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-combined-ca-bundle\") pod \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.840910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5zm\" (UniqueName: \"kubernetes.io/projected/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-kube-api-access-7t5zm\") pod \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\" (UID: \"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b\") " Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.844013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-kube-api-access-7t5zm" (OuterVolumeSpecName: "kube-api-access-7t5zm") pod "dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" (UID: "dac5e784-2fc2-4ba4-b1c2-feb99a9a283b"). InnerVolumeSpecName "kube-api-access-7t5zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.862754 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-config-data" (OuterVolumeSpecName: "config-data") pod "dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" (UID: "dac5e784-2fc2-4ba4-b1c2-feb99a9a283b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.870848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" (UID: "dac5e784-2fc2-4ba4-b1c2-feb99a9a283b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.942065 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5zm\" (UniqueName: \"kubernetes.io/projected/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-kube-api-access-7t5zm\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.942092 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:22 crc kubenswrapper[4735]: I0317 01:32:22.942112 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.005616 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3cb1df1b-1b15-4cbf-a6f7-a834869872cf","Type":"ContainerDied","Data":"52a098e48957189329fcd57ee5aed66e5f87dde26a6be370c463174650e24c7b"} Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.005633 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.005669 4735 scope.go:117] "RemoveContainer" containerID="66dc2769d83a4bd530751f01287778e003b84d3375f2e57f1bcf1c42ba0a44bd" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.010344 4735 generic.go:334] "Generic (PLEG): container finished" podID="dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" containerID="53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e" exitCode=0 Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.010396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b","Type":"ContainerDied","Data":"53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e"} Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.010419 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dac5e784-2fc2-4ba4-b1c2-feb99a9a283b","Type":"ContainerDied","Data":"e792df8ab1bc0cfc821088c0ca7cd148c9603b5462aa9c18b7b7527b1a6eea54"} Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.010454 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.022531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" event={"ID":"05381036-1334-4a2b-a3ce-c64331ba0ebb","Type":"ContainerDied","Data":"f245d92308b03fa9b82dcbf53ceb7c38ebd9b22b4d92eadcc8b01b2648e713e9"} Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.022568 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f245d92308b03fa9b82dcbf53ceb7c38ebd9b22b4d92eadcc8b01b2648e713e9" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.022624 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j9k5n" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.061544 4735 scope.go:117] "RemoveContainer" containerID="5456e77db7bf188423fcf0809e32e80d6204dba06a5e3755056552de64782144" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.155527 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.156126 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" containerName="nova-scheduler-scheduler" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.156231 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" containerName="nova-scheduler-scheduler" Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.156319 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9384926e-1432-4a7d-9a9a-10b05219cc9e" containerName="nova-manage" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.156397 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9384926e-1432-4a7d-9a9a-10b05219cc9e" containerName="nova-manage" Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.156478 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05381036-1334-4a2b-a3ce-c64331ba0ebb" containerName="nova-cell1-conductor-db-sync" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.156534 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="05381036-1334-4a2b-a3ce-c64331ba0ebb" containerName="nova-cell1-conductor-db-sync" Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.156619 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerName="init" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.156694 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" 
containerName="init" Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.156782 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-log" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.156837 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-log" Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.157046 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-metadata" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157121 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-metadata" Mar 17 01:32:23 crc kubenswrapper[4735]: E0317 01:32:23.157178 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerName="dnsmasq-dns" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157251 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerName="dnsmasq-dns" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157527 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="05381036-1334-4a2b-a3ce-c64331ba0ebb" containerName="nova-cell1-conductor-db-sync" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157617 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="15817936-1648-4ce6-bfdd-c1cea98fe7e9" containerName="dnsmasq-dns" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157720 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-metadata" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157797 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9384926e-1432-4a7d-9a9a-10b05219cc9e" containerName="nova-manage" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157873 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" containerName="nova-metadata-log" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.157929 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" containerName="nova-scheduler-scheduler" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.162479 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.166549 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.176517 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.179965 4735 scope.go:117] "RemoveContainer" containerID="53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.190717 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.214713 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.228403 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.244734 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.248762 4735 scope.go:117] "RemoveContainer" containerID="53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e" Mar 17 01:32:23 crc 
kubenswrapper[4735]: E0317 01:32:23.252395 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e\": container with ID starting with 53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e not found: ID does not exist" containerID="53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.252435 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e"} err="failed to get container status \"53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e\": rpc error: code = NotFound desc = could not find container \"53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e\": container with ID starting with 53ee228154439d4d6d7af53736d4e015db01649f05fb08b861b823f887764d3e not found: ID does not exist" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.253758 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.254564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959245b-5ccd-4b6a-925a-1032c2761405-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.254614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xv7\" (UniqueName: \"kubernetes.io/projected/0959245b-5ccd-4b6a-925a-1032c2761405-kube-api-access-p5xv7\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 
crc kubenswrapper[4735]: I0317 01:32:23.254973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959245b-5ccd-4b6a-925a-1032c2761405-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.255116 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.256761 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.264522 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.265849 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.268516 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.269735 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.284144 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.293391 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xv7\" (UniqueName: \"kubernetes.io/projected/0959245b-5ccd-4b6a-925a-1032c2761405-kube-api-access-p5xv7\") pod \"nova-cell1-conductor-0\" 
(UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjx47\" (UniqueName: \"kubernetes.io/projected/8ca93142-4c33-4202-a5ac-119dc29437c6-kube-api-access-rjx47\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfh2\" (UniqueName: \"kubernetes.io/projected/5093ba85-f6f4-4937-b6c2-f9b06f712145-kube-api-access-kjfh2\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356767 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356784 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.356823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-config-data\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.357086 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5093ba85-f6f4-4937-b6c2-f9b06f712145-logs\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.357305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959245b-5ccd-4b6a-925a-1032c2761405-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.357354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-config-data\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.357406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959245b-5ccd-4b6a-925a-1032c2761405-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.374785 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959245b-5ccd-4b6a-925a-1032c2761405-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.376246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959245b-5ccd-4b6a-925a-1032c2761405-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.382799 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xv7\" (UniqueName: \"kubernetes.io/projected/0959245b-5ccd-4b6a-925a-1032c2761405-kube-api-access-p5xv7\") pod \"nova-cell1-conductor-0\" (UID: \"0959245b-5ccd-4b6a-925a-1032c2761405\") " pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459360 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfh2\" (UniqueName: \"kubernetes.io/projected/5093ba85-f6f4-4937-b6c2-f9b06f712145-kube-api-access-kjfh2\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459462 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-config-data\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5093ba85-f6f4-4937-b6c2-f9b06f712145-logs\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-config-data\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjx47\" (UniqueName: \"kubernetes.io/projected/8ca93142-4c33-4202-a5ac-119dc29437c6-kube-api-access-rjx47\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.459651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" 
Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.460531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5093ba85-f6f4-4937-b6c2-f9b06f712145-logs\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.462849 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-config-data\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.463124 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.463579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-config-data\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.466803 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.467940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.480643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfh2\" (UniqueName: \"kubernetes.io/projected/5093ba85-f6f4-4937-b6c2-f9b06f712145-kube-api-access-kjfh2\") pod \"nova-metadata-0\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " pod="openstack/nova-metadata-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.482616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjx47\" (UniqueName: \"kubernetes.io/projected/8ca93142-4c33-4202-a5ac-119dc29437c6-kube-api-access-rjx47\") pod \"nova-scheduler-0\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.548420 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.570055 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:32:23 crc kubenswrapper[4735]: I0317 01:32:23.586252 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 01:32:24 crc kubenswrapper[4735]: I0317 01:32:24.025298 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 17 01:32:24 crc kubenswrapper[4735]: W0317 01:32:24.027015 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0959245b_5ccd_4b6a_925a_1032c2761405.slice/crio-42f0f2446aad676a935356dc60c19c4c5ff87a9b506f66e653b3cd105b33dee3 WatchSource:0}: Error finding container 42f0f2446aad676a935356dc60c19c4c5ff87a9b506f66e653b3cd105b33dee3: Status 404 returned error can't find the container with id 42f0f2446aad676a935356dc60c19c4c5ff87a9b506f66e653b3cd105b33dee3
Mar 17 01:32:24 crc kubenswrapper[4735]: I0317 01:32:24.125081 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 01:32:24 crc kubenswrapper[4735]: I0317 01:32:24.203790 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.044584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8ca93142-4c33-4202-a5ac-119dc29437c6","Type":"ContainerStarted","Data":"7dda9bdb9df1a2a50aac859fae30ebc01efca6e92f63658d693c3f41adab4278"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.044881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8ca93142-4c33-4202-a5ac-119dc29437c6","Type":"ContainerStarted","Data":"fd55dd67baa17133b519b979f80245666e00af0d72e48885fa7e1322b8a0c43b"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.047154 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0959245b-5ccd-4b6a-925a-1032c2761405","Type":"ContainerStarted","Data":"c03929ec9aae8b02bbb88e1dced9fd25b6842698e1d7c85f0ea5d1e5709f12c0"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.047197 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0959245b-5ccd-4b6a-925a-1032c2761405","Type":"ContainerStarted","Data":"42f0f2446aad676a935356dc60c19c4c5ff87a9b506f66e653b3cd105b33dee3"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.047266 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.049847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5093ba85-f6f4-4937-b6c2-f9b06f712145","Type":"ContainerStarted","Data":"2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.049898 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5093ba85-f6f4-4937-b6c2-f9b06f712145","Type":"ContainerStarted","Data":"654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.049909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5093ba85-f6f4-4937-b6c2-f9b06f712145","Type":"ContainerStarted","Data":"7441c2f62d9fe9c064e30d1008ef07a6c7e1a07fe86607534cf8df0670fadfcc"}
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.071471 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.07145444 podStartE2EDuration="2.07145444s" podCreationTimestamp="2026-03-17 01:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:25.064802244 +0000 UTC m=+1370.697035222" watchObservedRunningTime="2026-03-17 01:32:25.07145444 +0000 UTC m=+1370.703687418"
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.088177 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb1df1b-1b15-4cbf-a6f7-a834869872cf" path="/var/lib/kubelet/pods/3cb1df1b-1b15-4cbf-a6f7-a834869872cf/volumes"
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.090825 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac5e784-2fc2-4ba4-b1c2-feb99a9a283b" path="/var/lib/kubelet/pods/dac5e784-2fc2-4ba4-b1c2-feb99a9a283b/volumes"
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.095343 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.095325706 podStartE2EDuration="2.095325706s" podCreationTimestamp="2026-03-17 01:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:25.091587688 +0000 UTC m=+1370.723820666" watchObservedRunningTime="2026-03-17 01:32:25.095325706 +0000 UTC m=+1370.727558684"
Mar 17 01:32:25 crc kubenswrapper[4735]: I0317 01:32:25.113948 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.113930089 podStartE2EDuration="2.113930089s" podCreationTimestamp="2026-03-17 01:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:25.109838894 +0000 UTC m=+1370.742071872" watchObservedRunningTime="2026-03-17 01:32:25.113930089 +0000 UTC m=+1370.746163067"
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.065116 4735 generic.go:334] "Generic (PLEG): container finished" podID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerID="dc48295c9226f3dbd7a5193f4fa2861088e17241c00116c75c9cfa76a649d6e2" exitCode=0
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.066899 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"505a18f2-c4f5-4d41-bff0-50b7fb538824","Type":"ContainerDied","Data":"dc48295c9226f3dbd7a5193f4fa2861088e17241c00116c75c9cfa76a649d6e2"}
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.190486 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.309871 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-config-data\") pod \"505a18f2-c4f5-4d41-bff0-50b7fb538824\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") "
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.309950 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p8pw\" (UniqueName: \"kubernetes.io/projected/505a18f2-c4f5-4d41-bff0-50b7fb538824-kube-api-access-9p8pw\") pod \"505a18f2-c4f5-4d41-bff0-50b7fb538824\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") "
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.310003 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/505a18f2-c4f5-4d41-bff0-50b7fb538824-logs\") pod \"505a18f2-c4f5-4d41-bff0-50b7fb538824\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") "
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.310129 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-combined-ca-bundle\") pod \"505a18f2-c4f5-4d41-bff0-50b7fb538824\" (UID: \"505a18f2-c4f5-4d41-bff0-50b7fb538824\") "
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.310468 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/505a18f2-c4f5-4d41-bff0-50b7fb538824-logs" (OuterVolumeSpecName: "logs") pod "505a18f2-c4f5-4d41-bff0-50b7fb538824" (UID: "505a18f2-c4f5-4d41-bff0-50b7fb538824"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.310736 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/505a18f2-c4f5-4d41-bff0-50b7fb538824-logs\") on node \"crc\" DevicePath \"\""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.318480 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505a18f2-c4f5-4d41-bff0-50b7fb538824-kube-api-access-9p8pw" (OuterVolumeSpecName: "kube-api-access-9p8pw") pod "505a18f2-c4f5-4d41-bff0-50b7fb538824" (UID: "505a18f2-c4f5-4d41-bff0-50b7fb538824"). InnerVolumeSpecName "kube-api-access-9p8pw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.339367 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "505a18f2-c4f5-4d41-bff0-50b7fb538824" (UID: "505a18f2-c4f5-4d41-bff0-50b7fb538824"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.344571 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-config-data" (OuterVolumeSpecName: "config-data") pod "505a18f2-c4f5-4d41-bff0-50b7fb538824" (UID: "505a18f2-c4f5-4d41-bff0-50b7fb538824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.412755 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.412789 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505a18f2-c4f5-4d41-bff0-50b7fb538824-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.412799 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p8pw\" (UniqueName: \"kubernetes.io/projected/505a18f2-c4f5-4d41-bff0-50b7fb538824-kube-api-access-9p8pw\") on node \"crc\" DevicePath \"\""
Mar 17 01:32:26 crc kubenswrapper[4735]: I0317 01:32:26.934917 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.076155 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.085735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"505a18f2-c4f5-4d41-bff0-50b7fb538824","Type":"ContainerDied","Data":"26e003fb263eedfadd12bd2c5425f5aee47a5c9f5bb195c55e79ac71d3707212"}
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.085778 4735 scope.go:117] "RemoveContainer" containerID="dc48295c9226f3dbd7a5193f4fa2861088e17241c00116c75c9cfa76a649d6e2"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.106652 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.120278 4735 scope.go:117] "RemoveContainer" containerID="477517ba213534ea34f43e871daa58088be01ac7fcdf4ecfc028909ccbdbd882"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.119214 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.130075 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 17 01:32:27 crc kubenswrapper[4735]: E0317 01:32:27.130394 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-log"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.130411 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-log"
Mar 17 01:32:27 crc kubenswrapper[4735]: E0317 01:32:27.130445 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-api"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.130452 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-api"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.130619 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-api"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.130643 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" containerName="nova-api-log"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.131534 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.134478 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.153395 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.226994 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-config-data\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.227064 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hkt\" (UniqueName: \"kubernetes.io/projected/564a2452-2c8a-41c3-932b-d3e9731502e2-kube-api-access-r9hkt\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.227108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564a2452-2c8a-41c3-932b-d3e9731502e2-logs\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.227182 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.329453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564a2452-2c8a-41c3-932b-d3e9731502e2-logs\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.329554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.329717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-config-data\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.329770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hkt\" (UniqueName: \"kubernetes.io/projected/564a2452-2c8a-41c3-932b-d3e9731502e2-kube-api-access-r9hkt\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.331425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564a2452-2c8a-41c3-932b-d3e9731502e2-logs\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.338015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.351585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-config-data\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.363299 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hkt\" (UniqueName: \"kubernetes.io/projected/564a2452-2c8a-41c3-932b-d3e9731502e2-kube-api-access-r9hkt\") pod \"nova-api-0\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.459502 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 01:32:27 crc kubenswrapper[4735]: I0317 01:32:27.900240 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 17 01:32:27 crc kubenswrapper[4735]: W0317 01:32:27.923205 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod564a2452_2c8a_41c3_932b_d3e9731502e2.slice/crio-116fe84104915c7ff11eeaf46fdc95f48e94b392f4236a859b2834ce9e54ea71 WatchSource:0}: Error finding container 116fe84104915c7ff11eeaf46fdc95f48e94b392f4236a859b2834ce9e54ea71: Status 404 returned error can't find the container with id 116fe84104915c7ff11eeaf46fdc95f48e94b392f4236a859b2834ce9e54ea71
Mar 17 01:32:28 crc kubenswrapper[4735]: I0317 01:32:28.088085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"564a2452-2c8a-41c3-932b-d3e9731502e2","Type":"ContainerStarted","Data":"116fe84104915c7ff11eeaf46fdc95f48e94b392f4236a859b2834ce9e54ea71"}
Mar 17 01:32:28 crc kubenswrapper[4735]: I0317 01:32:28.570300 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 17 01:32:29 crc kubenswrapper[4735]: I0317 01:32:29.082508 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505a18f2-c4f5-4d41-bff0-50b7fb538824" path="/var/lib/kubelet/pods/505a18f2-c4f5-4d41-bff0-50b7fb538824/volumes"
Mar 17 01:32:29 crc kubenswrapper[4735]: I0317 01:32:29.100575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"564a2452-2c8a-41c3-932b-d3e9731502e2","Type":"ContainerStarted","Data":"e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7"}
Mar 17 01:32:29 crc kubenswrapper[4735]: I0317 01:32:29.101980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"564a2452-2c8a-41c3-932b-d3e9731502e2","Type":"ContainerStarted","Data":"f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad"}
Mar 17 01:32:29 crc kubenswrapper[4735]: I0317 01:32:29.125296 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.125274818 podStartE2EDuration="2.125274818s" podCreationTimestamp="2026-03-17 01:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:29.116572585 +0000 UTC m=+1374.748805563" watchObservedRunningTime="2026-03-17 01:32:29.125274818 +0000 UTC m=+1374.757507806"
Mar 17 01:32:30 crc kubenswrapper[4735]: I0317 01:32:30.441997 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 17 01:32:30 crc kubenswrapper[4735]: I0317 01:32:30.442204 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" containerName="kube-state-metrics" containerID="cri-o://5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3" gracePeriod=30
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.054232 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.118313 4735 generic.go:334] "Generic (PLEG): container finished" podID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" containerID="5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3" exitCode=2
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.118448 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.124742 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1dfe3748-6604-4210-a284-1f0bfdcdf01f","Type":"ContainerDied","Data":"5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3"}
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.124774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1dfe3748-6604-4210-a284-1f0bfdcdf01f","Type":"ContainerDied","Data":"32b0ac083d8e54fa7f4f1558f28d94789387877e191df2670f38fe2db9c0644b"}
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.124793 4735 scope.go:117] "RemoveContainer" containerID="5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.173271 4735 scope.go:117] "RemoveContainer" containerID="5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3"
Mar 17 01:32:31 crc kubenswrapper[4735]: E0317 01:32:31.173822 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3\": container with ID starting with 5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3 not found: ID does not exist" containerID="5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.173854 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3"} err="failed to get container status \"5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3\": rpc error: code = NotFound desc = could not find container \"5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3\": container with ID starting with 5c01f72a84c2268a9faf6d1dd1f6221968b4f4c06bf283af80c04a5db1b662c3 not found: ID does not exist"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.201408 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfq92\" (UniqueName: \"kubernetes.io/projected/1dfe3748-6604-4210-a284-1f0bfdcdf01f-kube-api-access-tfq92\") pod \"1dfe3748-6604-4210-a284-1f0bfdcdf01f\" (UID: \"1dfe3748-6604-4210-a284-1f0bfdcdf01f\") "
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.207952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfe3748-6604-4210-a284-1f0bfdcdf01f-kube-api-access-tfq92" (OuterVolumeSpecName: "kube-api-access-tfq92") pod "1dfe3748-6604-4210-a284-1f0bfdcdf01f" (UID: "1dfe3748-6604-4210-a284-1f0bfdcdf01f"). InnerVolumeSpecName "kube-api-access-tfq92". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.303252 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfq92\" (UniqueName: \"kubernetes.io/projected/1dfe3748-6604-4210-a284-1f0bfdcdf01f-kube-api-access-tfq92\") on node \"crc\" DevicePath \"\""
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.471819 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.479193 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.492805 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 17 01:32:31 crc kubenswrapper[4735]: E0317 01:32:31.493273 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" containerName="kube-state-metrics"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.493285 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" containerName="kube-state-metrics"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.493439 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" containerName="kube-state-metrics"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.494030 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.496156 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.496522 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.507885 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.610665 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.611006 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27d8\" (UniqueName: \"kubernetes.io/projected/0da67913-d53f-43fb-8100-c10acda35893-kube-api-access-h27d8\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.611266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.611357 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.713596 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.713710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.713752 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.713921 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h27d8\" (UniqueName: \"kubernetes.io/projected/0da67913-d53f-43fb-8100-c10acda35893-kube-api-access-h27d8\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.717618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.718156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.719405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da67913-d53f-43fb-8100-c10acda35893-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.742068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h27d8\" (UniqueName: \"kubernetes.io/projected/0da67913-d53f-43fb-8100-c10acda35893-kube-api-access-h27d8\") pod \"kube-state-metrics-0\" (UID: \"0da67913-d53f-43fb-8100-c10acda35893\") " pod="openstack/kube-state-metrics-0"
Mar 17 01:32:31 crc kubenswrapper[4735]: I0317 01:32:31.817873 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 17 01:32:32 crc kubenswrapper[4735]: I0317 01:32:32.325194 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 17 01:32:32 crc kubenswrapper[4735]: I0317 01:32:32.418294 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 01:32:32 crc kubenswrapper[4735]: I0317 01:32:32.418543 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-central-agent" containerID="cri-o://cccd3a6760b6702e9d69c6188245bf360640cbe3c4dfc4f68224e1e2841ef165" gracePeriod=30
Mar 17 01:32:32 crc kubenswrapper[4735]: I0317 01:32:32.418619 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="sg-core" containerID="cri-o://6aafdd5cb311e4f1a4c682844f51021f4ac983a71a9add6b070c6e3b5040923a" gracePeriod=30
Mar 17 01:32:32 crc kubenswrapper[4735]: I0317 01:32:32.418623 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="proxy-httpd" containerID="cri-o://a2758603db03a8c385747f6d3a18cc38c2694322e5dd0bffa04f1c7a165bafa3" gracePeriod=30
Mar 17 01:32:32 crc kubenswrapper[4735]: I0317 01:32:32.418654 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-notification-agent" containerID="cri-o://91faf749add6a123dabdccc8d3cf17c7b85b229fa2e2ca1be83a410028ef93c7" gracePeriod=30
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.094942 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dfe3748-6604-4210-a284-1f0bfdcdf01f" path="/var/lib/kubelet/pods/1dfe3748-6604-4210-a284-1f0bfdcdf01f/volumes"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.158449 4735 generic.go:334] "Generic (PLEG): container finished" podID="e245b634-5cfa-4496-8b2e-86264de5854d" containerID="a2758603db03a8c385747f6d3a18cc38c2694322e5dd0bffa04f1c7a165bafa3" exitCode=0
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.158787 4735 generic.go:334] "Generic (PLEG): container finished" podID="e245b634-5cfa-4496-8b2e-86264de5854d" containerID="6aafdd5cb311e4f1a4c682844f51021f4ac983a71a9add6b070c6e3b5040923a" exitCode=2
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.158910 4735 generic.go:334] "Generic (PLEG): container finished" podID="e245b634-5cfa-4496-8b2e-86264de5854d" containerID="cccd3a6760b6702e9d69c6188245bf360640cbe3c4dfc4f68224e1e2841ef165" exitCode=0
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.158560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerDied","Data":"a2758603db03a8c385747f6d3a18cc38c2694322e5dd0bffa04f1c7a165bafa3"}
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.159135 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerDied","Data":"6aafdd5cb311e4f1a4c682844f51021f4ac983a71a9add6b070c6e3b5040923a"}
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.159215 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerDied","Data":"cccd3a6760b6702e9d69c6188245bf360640cbe3c4dfc4f68224e1e2841ef165"}
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.160984 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0da67913-d53f-43fb-8100-c10acda35893","Type":"ContainerStarted","Data":"35f51d95d5addb7c40c58136bcc8620ea913bd7abc2fbcd9ee108d92e245c4bf"}
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.161025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0da67913-d53f-43fb-8100-c10acda35893","Type":"ContainerStarted","Data":"5c211997fae1a7e452f8145e4050d49bc6d5f8abb0702affedc941469f409b29"}
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.161463 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.570812 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.588033 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.588076 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.602978 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.614582 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 17 01:32:33 crc kubenswrapper[4735]: I0317 01:32:33.621061 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.154256878 podStartE2EDuration="2.621038445s" podCreationTimestamp="2026-03-17 01:32:31 +0000 UTC" firstStartedPulling="2026-03-17 01:32:32.350426476 +0000 UTC m=+1377.982659464" lastFinishedPulling="2026-03-17 01:32:32.817208053 +0000 UTC m=+1378.449441031" observedRunningTime="2026-03-17 01:32:33.186283004 +0000 UTC m=+1378.818516072" watchObservedRunningTime="2026-03-17 01:32:33.621038445 +0000 UTC m=+1379.253271433"
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.174060 4735 generic.go:334] "Generic (PLEG): container finished" podID="e245b634-5cfa-4496-8b2e-86264de5854d" containerID="91faf749add6a123dabdccc8d3cf17c7b85b229fa2e2ca1be83a410028ef93c7" exitCode=0
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.174125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerDied","Data":"91faf749add6a123dabdccc8d3cf17c7b85b229fa2e2ca1be83a410028ef93c7"}
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.174180 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e245b634-5cfa-4496-8b2e-86264de5854d","Type":"ContainerDied","Data":"a8417a613047f43836027b3cd572af5f3b8630b1048535b65b17f1278854f6f3"}
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.174194 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8417a613047f43836027b3cd572af5f3b8630b1048535b65b17f1278854f6f3"
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.175380 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.235323 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.368388 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-run-httpd\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") "
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.368453 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-config-data\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") "
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.368476 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-sg-core-conf-yaml\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") "
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.368496 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-scripts\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") "
Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.368546 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-combined-ca-bundle\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") "
Mar 17 01:32:34 crc
kubenswrapper[4735]: I0317 01:32:34.368644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gng\" (UniqueName: \"kubernetes.io/projected/e245b634-5cfa-4496-8b2e-86264de5854d-kube-api-access-c6gng\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.368763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-log-httpd\") pod \"e245b634-5cfa-4496-8b2e-86264de5854d\" (UID: \"e245b634-5cfa-4496-8b2e-86264de5854d\") " Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.369556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.369661 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.370417 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.374985 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e245b634-5cfa-4496-8b2e-86264de5854d-kube-api-access-c6gng" (OuterVolumeSpecName: "kube-api-access-c6gng") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "kube-api-access-c6gng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.383949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-scripts" (OuterVolumeSpecName: "scripts") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.399273 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.471645 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.471673 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.471682 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gng\" (UniqueName: \"kubernetes.io/projected/e245b634-5cfa-4496-8b2e-86264de5854d-kube-api-access-c6gng\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.471691 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e245b634-5cfa-4496-8b2e-86264de5854d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.527282 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.538924 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-config-data" (OuterVolumeSpecName: "config-data") pod "e245b634-5cfa-4496-8b2e-86264de5854d" (UID: "e245b634-5cfa-4496-8b2e-86264de5854d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.573143 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.573173 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e245b634-5cfa-4496-8b2e-86264de5854d-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.601971 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:32:34 crc kubenswrapper[4735]: I0317 01:32:34.602015 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.184417 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.207558 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.219712 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.233469 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:35 crc kubenswrapper[4735]: E0317 01:32:35.233972 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-central-agent" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234031 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-central-agent" Mar 17 01:32:35 crc kubenswrapper[4735]: E0317 01:32:35.234083 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-notification-agent" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234157 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-notification-agent" Mar 17 01:32:35 crc kubenswrapper[4735]: E0317 01:32:35.234209 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="proxy-httpd" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234251 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="proxy-httpd" Mar 17 01:32:35 crc kubenswrapper[4735]: E0317 01:32:35.234309 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="sg-core" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234352 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="sg-core" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234545 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="sg-core" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234609 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-central-agent" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234661 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="proxy-httpd" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.234711 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" containerName="ceilometer-notification-agent" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.236356 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.243716 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.243764 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.248926 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.258655 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-log-httpd\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387267 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml5nj\" (UniqueName: \"kubernetes.io/projected/32d8aab6-869a-468e-a54d-39fd9155c440-kube-api-access-ml5nj\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-scripts\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387438 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-run-httpd\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387480 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-config-data\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.387616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-log-httpd\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml5nj\" (UniqueName: \"kubernetes.io/projected/32d8aab6-869a-468e-a54d-39fd9155c440-kube-api-access-ml5nj\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489180 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-scripts\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-run-httpd\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-config-data\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.489738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.491392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-log-httpd\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.491443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-run-httpd\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.496633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-scripts\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.496659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.503340 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.506087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-config-data\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.515166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.526502 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml5nj\" (UniqueName: \"kubernetes.io/projected/32d8aab6-869a-468e-a54d-39fd9155c440-kube-api-access-ml5nj\") pod \"ceilometer-0\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " pod="openstack/ceilometer-0" Mar 17 01:32:35 crc kubenswrapper[4735]: I0317 01:32:35.570310 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:36 crc kubenswrapper[4735]: W0317 01:32:36.069694 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32d8aab6_869a_468e_a54d_39fd9155c440.slice/crio-6450eda1ca967f22844fc39ad22e7f50ea5d733e1a7bb95d1a83b9bd0dc957f0 WatchSource:0}: Error finding container 6450eda1ca967f22844fc39ad22e7f50ea5d733e1a7bb95d1a83b9bd0dc957f0: Status 404 returned error can't find the container with id 6450eda1ca967f22844fc39ad22e7f50ea5d733e1a7bb95d1a83b9bd0dc957f0 Mar 17 01:32:36 crc kubenswrapper[4735]: I0317 01:32:36.081331 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:36 crc kubenswrapper[4735]: I0317 01:32:36.194630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerStarted","Data":"6450eda1ca967f22844fc39ad22e7f50ea5d733e1a7bb95d1a83b9bd0dc957f0"} Mar 17 01:32:37 crc kubenswrapper[4735]: I0317 01:32:37.082920 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e245b634-5cfa-4496-8b2e-86264de5854d" path="/var/lib/kubelet/pods/e245b634-5cfa-4496-8b2e-86264de5854d/volumes" Mar 17 01:32:37 crc kubenswrapper[4735]: I0317 01:32:37.204286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerStarted","Data":"5c932091dcdcd5e621cb486b1511cb8735272a8159ac8fc8bbacc8785766ab8c"} Mar 17 01:32:37 crc kubenswrapper[4735]: I0317 01:32:37.205000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerStarted","Data":"f9a9983577e1a29e4ef652707579c842f1ed6c7ddd48f31b83682a5f71ab5b74"} Mar 17 01:32:37 crc kubenswrapper[4735]: I0317 01:32:37.460456 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 01:32:37 crc kubenswrapper[4735]: I0317 01:32:37.460871 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 01:32:38 crc kubenswrapper[4735]: I0317 01:32:38.215054 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerStarted","Data":"831715b09f3c87a99f53435edb7ec87118c457e92bfd53f1dc96125ff5472408"} Mar 17 01:32:38 crc kubenswrapper[4735]: I0317 01:32:38.543059 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:32:38 crc kubenswrapper[4735]: I0317 01:32:38.543147 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:32:39 crc kubenswrapper[4735]: I0317 01:32:39.236836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerStarted","Data":"733328c333bfae61ef37bdeea89a3866d3e1641d7360a7d6f881faaa1e62ed0a"} Mar 17 01:32:39 crc kubenswrapper[4735]: I0317 01:32:39.239815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:32:39 crc kubenswrapper[4735]: I0317 01:32:39.280949 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.449452959 podStartE2EDuration="4.280930652s" podCreationTimestamp="2026-03-17 01:32:35 +0000 UTC" 
firstStartedPulling="2026-03-17 01:32:36.072704218 +0000 UTC m=+1381.704937216" lastFinishedPulling="2026-03-17 01:32:38.904181931 +0000 UTC m=+1384.536414909" observedRunningTime="2026-03-17 01:32:39.267604281 +0000 UTC m=+1384.899837269" watchObservedRunningTime="2026-03-17 01:32:39.280930652 +0000 UTC m=+1384.913163630" Mar 17 01:32:41 crc kubenswrapper[4735]: I0317 01:32:41.587267 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 01:32:41 crc kubenswrapper[4735]: I0317 01:32:41.589791 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 01:32:41 crc kubenswrapper[4735]: I0317 01:32:41.850841 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 17 01:32:42 crc kubenswrapper[4735]: I0317 01:32:42.606467 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:32:42 crc kubenswrapper[4735]: I0317 01:32:42.606552 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:32:43 crc kubenswrapper[4735]: I0317 01:32:43.605969 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 01:32:43 crc kubenswrapper[4735]: I0317 01:32:43.606379 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 01:32:43 crc kubenswrapper[4735]: I0317 01:32:43.615741 4735 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 01:32:43 crc kubenswrapper[4735]: I0317 01:32:43.616467 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.246158 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.310061 4735 generic.go:334] "Generic (PLEG): container finished" podID="2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" containerID="1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16" exitCode=137 Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.310572 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.310572 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a","Type":"ContainerDied","Data":"1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16"} Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.310639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a","Type":"ContainerDied","Data":"b1ac5f3167d308d4a55734ff6aa38073fa7493acee4049c3209c48f31fcb0ce0"} Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.310666 4735 scope.go:117] "RemoveContainer" containerID="1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.341519 4735 scope.go:117] "RemoveContainer" containerID="1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16" Mar 17 01:32:45 crc kubenswrapper[4735]: E0317 01:32:45.342023 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16\": container with ID starting with 1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16 not found: ID does not exist" containerID="1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.342059 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16"} err="failed to get container status \"1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16\": rpc error: code = NotFound desc = could not find container \"1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16\": container with ID starting with 1d3b99e373e40d685ca90e10c3fffd134e1e515ac4ee2d8f09c5ce2ba7ae7d16 not found: ID does not exist" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.377985 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-combined-ca-bundle\") pod \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.378178 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phhjq\" (UniqueName: \"kubernetes.io/projected/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-kube-api-access-phhjq\") pod \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.378206 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-config-data\") pod \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\" (UID: \"2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a\") " Mar 17 01:32:45 
crc kubenswrapper[4735]: I0317 01:32:45.383236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-kube-api-access-phhjq" (OuterVolumeSpecName: "kube-api-access-phhjq") pod "2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" (UID: "2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a"). InnerVolumeSpecName "kube-api-access-phhjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.404416 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" (UID: "2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.409730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-config-data" (OuterVolumeSpecName: "config-data") pod "2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" (UID: "2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.460317 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.460366 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.481066 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phhjq\" (UniqueName: \"kubernetes.io/projected/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-kube-api-access-phhjq\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.481096 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.481106 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.653018 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.666880 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.680884 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:45 crc kubenswrapper[4735]: E0317 01:32:45.681236 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.681253 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.681434 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.681983 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.686828 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.688878 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.699978 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.714478 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.786928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2k4\" (UniqueName: \"kubernetes.io/projected/f7a2283d-990f-46e0-b9e7-e2891468873c-kube-api-access-2z2k4\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.786978 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.787055 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.787145 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.787227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.888518 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.888633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2k4\" (UniqueName: \"kubernetes.io/projected/f7a2283d-990f-46e0-b9e7-e2891468873c-kube-api-access-2z2k4\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.888674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.888750 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.888785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.894177 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.894296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.895490 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.901721 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a2283d-990f-46e0-b9e7-e2891468873c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:45 crc kubenswrapper[4735]: I0317 01:32:45.905276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2k4\" (UniqueName: \"kubernetes.io/projected/f7a2283d-990f-46e0-b9e7-e2891468873c-kube-api-access-2z2k4\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7a2283d-990f-46e0-b9e7-e2891468873c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:46 crc kubenswrapper[4735]: I0317 01:32:46.000659 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:46 crc kubenswrapper[4735]: I0317 01:32:46.490492 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 01:32:46 crc kubenswrapper[4735]: W0317 01:32:46.492629 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7a2283d_990f_46e0_b9e7_e2891468873c.slice/crio-371590b9831cc679dc301ee1b77fd9907306302e64d39d5be67aa3b27211e226 WatchSource:0}: Error finding container 371590b9831cc679dc301ee1b77fd9907306302e64d39d5be67aa3b27211e226: Status 404 returned error can't find the container with id 371590b9831cc679dc301ee1b77fd9907306302e64d39d5be67aa3b27211e226 Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.090120 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a" path="/var/lib/kubelet/pods/2bd68bfe-d87a-41b2-8498-2f5e64b0ff9a/volumes" Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.338459 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7a2283d-990f-46e0-b9e7-e2891468873c","Type":"ContainerStarted","Data":"5f3a479c077e2b15f371fab9d063969ec89225accc60e261379d9314dd60d195"} Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.338503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7a2283d-990f-46e0-b9e7-e2891468873c","Type":"ContainerStarted","Data":"371590b9831cc679dc301ee1b77fd9907306302e64d39d5be67aa3b27211e226"} Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.380889 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.380844969 podStartE2EDuration="2.380844969s" podCreationTimestamp="2026-03-17 01:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:47.368293616 +0000 UTC m=+1393.000526644" watchObservedRunningTime="2026-03-17 01:32:47.380844969 +0000 UTC m=+1393.013077957" Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.464471 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.466398 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.470158 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 01:32:47 crc kubenswrapper[4735]: I0317 01:32:47.766265 4735 scope.go:117] "RemoveContainer" containerID="77fcb39e16c9768adfa0fa05624e0745133bc102fbdedae525200f364f4ba151" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.357667 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.626849 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f77cfbd7c-pjq9q"] Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.628374 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.660251 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f77cfbd7c-pjq9q"] Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.745772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-nb\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.745847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-swift-storage-0\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.745902 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rwk7\" (UniqueName: \"kubernetes.io/projected/61734c18-4914-46f6-8994-0801068b497b-kube-api-access-4rwk7\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.745928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-config\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.745976 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-sb\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.746010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-svc\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.847908 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-swift-storage-0\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.848164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rwk7\" (UniqueName: \"kubernetes.io/projected/61734c18-4914-46f6-8994-0801068b497b-kube-api-access-4rwk7\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.848195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-config\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.848245 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-sb\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.848280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-svc\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.848348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-nb\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.849365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-swift-storage-0\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.849373 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-sb\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.849388 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-nb\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.850006 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-svc\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.850089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-config\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:48 crc kubenswrapper[4735]: I0317 01:32:48.899267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rwk7\" (UniqueName: \"kubernetes.io/projected/61734c18-4914-46f6-8994-0801068b497b-kube-api-access-4rwk7\") pod \"dnsmasq-dns-5f77cfbd7c-pjq9q\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:49 crc kubenswrapper[4735]: I0317 01:32:49.002809 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:49 crc kubenswrapper[4735]: W0317 01:32:49.489494 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61734c18_4914_46f6_8994_0801068b497b.slice/crio-fd7f0c7f56a3b13dd837274af05c9ae9029b01c82ef5ca125d564b3d45d9a9c4 WatchSource:0}: Error finding container fd7f0c7f56a3b13dd837274af05c9ae9029b01c82ef5ca125d564b3d45d9a9c4: Status 404 returned error can't find the container with id fd7f0c7f56a3b13dd837274af05c9ae9029b01c82ef5ca125d564b3d45d9a9c4 Mar 17 01:32:49 crc kubenswrapper[4735]: I0317 01:32:49.489754 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f77cfbd7c-pjq9q"] Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.380211 4735 generic.go:334] "Generic (PLEG): container finished" podID="61734c18-4914-46f6-8994-0801068b497b" containerID="4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac" exitCode=0 Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.380302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" event={"ID":"61734c18-4914-46f6-8994-0801068b497b","Type":"ContainerDied","Data":"4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac"} Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.380823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" event={"ID":"61734c18-4914-46f6-8994-0801068b497b","Type":"ContainerStarted","Data":"fd7f0c7f56a3b13dd837274af05c9ae9029b01c82ef5ca125d564b3d45d9a9c4"} Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.832095 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.832560 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-central-agent" containerID="cri-o://f9a9983577e1a29e4ef652707579c842f1ed6c7ddd48f31b83682a5f71ab5b74" gracePeriod=30 Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.833213 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="proxy-httpd" containerID="cri-o://733328c333bfae61ef37bdeea89a3866d3e1641d7360a7d6f881faaa1e62ed0a" gracePeriod=30 Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.833262 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="sg-core" containerID="cri-o://831715b09f3c87a99f53435edb7ec87118c457e92bfd53f1dc96125ff5472408" gracePeriod=30 Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.833290 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-notification-agent" containerID="cri-o://5c932091dcdcd5e621cb486b1511cb8735272a8159ac8fc8bbacc8785766ab8c" gracePeriod=30 Mar 17 01:32:50 crc kubenswrapper[4735]: I0317 01:32:50.841978 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.223:3000/\": read tcp 10.217.0.2:35510->10.217.0.223:3000: read: connection reset by peer" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.001443 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.114293 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.394283 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" event={"ID":"61734c18-4914-46f6-8994-0801068b497b","Type":"ContainerStarted","Data":"792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7"} Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.394359 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.398746 4735 generic.go:334] "Generic (PLEG): container finished" podID="32d8aab6-869a-468e-a54d-39fd9155c440" containerID="733328c333bfae61ef37bdeea89a3866d3e1641d7360a7d6f881faaa1e62ed0a" exitCode=0 Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.398792 4735 generic.go:334] "Generic (PLEG): container finished" podID="32d8aab6-869a-468e-a54d-39fd9155c440" containerID="831715b09f3c87a99f53435edb7ec87118c457e92bfd53f1dc96125ff5472408" exitCode=2 Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.398802 4735 generic.go:334] "Generic (PLEG): container finished" podID="32d8aab6-869a-468e-a54d-39fd9155c440" containerID="5c932091dcdcd5e621cb486b1511cb8735272a8159ac8fc8bbacc8785766ab8c" exitCode=0 Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.398810 4735 generic.go:334] "Generic (PLEG): container finished" podID="32d8aab6-869a-468e-a54d-39fd9155c440" containerID="f9a9983577e1a29e4ef652707579c842f1ed6c7ddd48f31b83682a5f71ab5b74" exitCode=0 Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.398978 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-log" containerID="cri-o://f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad" gracePeriod=30 Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.399036 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-api" containerID="cri-o://e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7" gracePeriod=30 Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.399073 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerDied","Data":"733328c333bfae61ef37bdeea89a3866d3e1641d7360a7d6f881faaa1e62ed0a"} Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.399115 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerDied","Data":"831715b09f3c87a99f53435edb7ec87118c457e92bfd53f1dc96125ff5472408"} Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.399129 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerDied","Data":"5c932091dcdcd5e621cb486b1511cb8735272a8159ac8fc8bbacc8785766ab8c"} Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.399138 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerDied","Data":"f9a9983577e1a29e4ef652707579c842f1ed6c7ddd48f31b83682a5f71ab5b74"} Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.431660 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" podStartSLOduration=3.431643067 podStartE2EDuration="3.431643067s" podCreationTimestamp="2026-03-17 01:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:51.423978248 +0000 UTC m=+1397.056211226" watchObservedRunningTime="2026-03-17 01:32:51.431643067 +0000 UTC m=+1397.063876035" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.671892 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.704758 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml5nj\" (UniqueName: \"kubernetes.io/projected/32d8aab6-869a-468e-a54d-39fd9155c440-kube-api-access-ml5nj\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.704928 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-config-data\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705044 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-ceilometer-tls-certs\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705101 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-scripts\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705136 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-combined-ca-bundle\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705172 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-run-httpd\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-sg-core-conf-yaml\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-log-httpd\") pod \"32d8aab6-869a-468e-a54d-39fd9155c440\" (UID: \"32d8aab6-869a-468e-a54d-39fd9155c440\") " Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705761 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.705918 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.706199 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.706227 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32d8aab6-869a-468e-a54d-39fd9155c440-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.710850 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d8aab6-869a-468e-a54d-39fd9155c440-kube-api-access-ml5nj" (OuterVolumeSpecName: "kube-api-access-ml5nj") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "kube-api-access-ml5nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.711569 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-scripts" (OuterVolumeSpecName: "scripts") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.747712 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.777990 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.808015 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.808049 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml5nj\" (UniqueName: \"kubernetes.io/projected/32d8aab6-869a-468e-a54d-39fd9155c440-kube-api-access-ml5nj\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.808042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-config-data" (OuterVolumeSpecName: "config-data") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.808064 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.808131 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.810033 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32d8aab6-869a-468e-a54d-39fd9155c440" (UID: "32d8aab6-869a-468e-a54d-39fd9155c440"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.910027 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:51 crc kubenswrapper[4735]: I0317 01:32:51.910055 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d8aab6-869a-468e-a54d-39fd9155c440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.410182 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32d8aab6-869a-468e-a54d-39fd9155c440","Type":"ContainerDied","Data":"6450eda1ca967f22844fc39ad22e7f50ea5d733e1a7bb95d1a83b9bd0dc957f0"} Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.410218 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.410499 4735 scope.go:117] "RemoveContainer" containerID="733328c333bfae61ef37bdeea89a3866d3e1641d7360a7d6f881faaa1e62ed0a" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.425498 4735 generic.go:334] "Generic (PLEG): container finished" podID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerID="f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad" exitCode=143 Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.425954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"564a2452-2c8a-41c3-932b-d3e9731502e2","Type":"ContainerDied","Data":"f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad"} Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.450261 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.452740 4735 scope.go:117] "RemoveContainer" containerID="831715b09f3c87a99f53435edb7ec87118c457e92bfd53f1dc96125ff5472408" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.464051 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.471718 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:52 crc kubenswrapper[4735]: E0317 01:32:52.474695 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="proxy-httpd" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.474731 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="proxy-httpd" Mar 17 01:32:52 crc kubenswrapper[4735]: E0317 01:32:52.474763 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" 
containerName="ceilometer-central-agent" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.474770 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-central-agent" Mar 17 01:32:52 crc kubenswrapper[4735]: E0317 01:32:52.474796 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-notification-agent" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.474804 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-notification-agent" Mar 17 01:32:52 crc kubenswrapper[4735]: E0317 01:32:52.474815 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="sg-core" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.474820 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="sg-core" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.475064 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="sg-core" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.475081 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-notification-agent" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.475093 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="ceilometer-central-agent" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.475124 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" containerName="proxy-httpd" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.477031 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.477510 4735 scope.go:117] "RemoveContainer" containerID="5c932091dcdcd5e621cb486b1511cb8735272a8159ac8fc8bbacc8785766ab8c" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.483026 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.483110 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.483967 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.490020 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.517020 4735 scope.go:117] "RemoveContainer" containerID="f9a9983577e1a29e4ef652707579c842f1ed6c7ddd48f31b83682a5f71ab5b74" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520200 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-scripts\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520382 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-config-data\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520461 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520496 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520521 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-log-httpd\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520545 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8mt6\" (UniqueName: \"kubernetes.io/projected/77908703-19fa-4ace-8d06-72b05fa0dbe0-kube-api-access-g8mt6\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.520566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-config-data\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622894 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-log-httpd\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g8mt6\" (UniqueName: \"kubernetes.io/projected/77908703-19fa-4ace-8d06-72b05fa0dbe0-kube-api-access-g8mt6\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.622949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-run-httpd\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.623006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-scripts\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.626381 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-log-httpd\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.626641 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-run-httpd\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.627073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 
01:32:52.627223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.633381 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.633484 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-scripts\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.634274 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-config-data\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.641012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8mt6\" (UniqueName: \"kubernetes.io/projected/77908703-19fa-4ace-8d06-72b05fa0dbe0-kube-api-access-g8mt6\") pod \"ceilometer-0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.803952 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:32:52 crc kubenswrapper[4735]: I0317 01:32:52.980392 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:53 crc kubenswrapper[4735]: I0317 01:32:53.086515 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d8aab6-869a-468e-a54d-39fd9155c440" path="/var/lib/kubelet/pods/32d8aab6-869a-468e-a54d-39fd9155c440/volumes" Mar 17 01:32:53 crc kubenswrapper[4735]: I0317 01:32:53.334441 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:32:53 crc kubenswrapper[4735]: I0317 01:32:53.439740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerStarted","Data":"d2acb2263386374b0bc5986ac33e70134191575c351dc0ac8854810cc4d9dc6f"} Mar 17 01:32:54 crc kubenswrapper[4735]: I0317 01:32:54.451837 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerStarted","Data":"50843384154bbb40178842e70addad049293a918e0e745b0fbff961692eb171c"} Mar 17 01:32:54 crc kubenswrapper[4735]: I0317 01:32:54.453750 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerStarted","Data":"b27cf6d8ef5ac74fa72905b2b3e8ebf15e28b25d68410183d6f6e677342e438e"} Mar 17 01:32:54 crc kubenswrapper[4735]: I0317 01:32:54.972999 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.089348 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-config-data\") pod \"564a2452-2c8a-41c3-932b-d3e9731502e2\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.089521 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564a2452-2c8a-41c3-932b-d3e9731502e2-logs\") pod \"564a2452-2c8a-41c3-932b-d3e9731502e2\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.089598 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-combined-ca-bundle\") pod \"564a2452-2c8a-41c3-932b-d3e9731502e2\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.089656 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hkt\" (UniqueName: \"kubernetes.io/projected/564a2452-2c8a-41c3-932b-d3e9731502e2-kube-api-access-r9hkt\") pod \"564a2452-2c8a-41c3-932b-d3e9731502e2\" (UID: \"564a2452-2c8a-41c3-932b-d3e9731502e2\") " Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.091315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564a2452-2c8a-41c3-932b-d3e9731502e2-logs" (OuterVolumeSpecName: "logs") pod "564a2452-2c8a-41c3-932b-d3e9731502e2" (UID: "564a2452-2c8a-41c3-932b-d3e9731502e2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.098925 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564a2452-2c8a-41c3-932b-d3e9731502e2-kube-api-access-r9hkt" (OuterVolumeSpecName: "kube-api-access-r9hkt") pod "564a2452-2c8a-41c3-932b-d3e9731502e2" (UID: "564a2452-2c8a-41c3-932b-d3e9731502e2"). InnerVolumeSpecName "kube-api-access-r9hkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.131705 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-config-data" (OuterVolumeSpecName: "config-data") pod "564a2452-2c8a-41c3-932b-d3e9731502e2" (UID: "564a2452-2c8a-41c3-932b-d3e9731502e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.157164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "564a2452-2c8a-41c3-932b-d3e9731502e2" (UID: "564a2452-2c8a-41c3-932b-d3e9731502e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.195340 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564a2452-2c8a-41c3-932b-d3e9731502e2-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.195395 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.195405 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9hkt\" (UniqueName: \"kubernetes.io/projected/564a2452-2c8a-41c3-932b-d3e9731502e2-kube-api-access-r9hkt\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.195414 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564a2452-2c8a-41c3-932b-d3e9731502e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.462069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerStarted","Data":"61ff74f9fd46346d6c44802a63bfa4e89c16c6ec7e9cc4c17a6acd0541931b28"} Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.464034 4735 generic.go:334] "Generic (PLEG): container finished" podID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerID="e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7" exitCode=0 Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.464084 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"564a2452-2c8a-41c3-932b-d3e9731502e2","Type":"ContainerDied","Data":"e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7"} Mar 17 01:32:55 crc 
kubenswrapper[4735]: I0317 01:32:55.464102 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.464130 4735 scope.go:117] "RemoveContainer" containerID="e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.464118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"564a2452-2c8a-41c3-932b-d3e9731502e2","Type":"ContainerDied","Data":"116fe84104915c7ff11eeaf46fdc95f48e94b392f4236a859b2834ce9e54ea71"} Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.488489 4735 scope.go:117] "RemoveContainer" containerID="f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.505614 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.512774 4735 scope.go:117] "RemoveContainer" containerID="e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7" Mar 17 01:32:55 crc kubenswrapper[4735]: E0317 01:32:55.513894 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7\": container with ID starting with e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7 not found: ID does not exist" containerID="e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.513937 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7"} err="failed to get container status \"e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7\": rpc error: code = NotFound desc = could not find container 
\"e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7\": container with ID starting with e6dcb81540d6fdbb0e277418a071385d60b6ebd9493fda08399d11d63c9706d7 not found: ID does not exist" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.513966 4735 scope.go:117] "RemoveContainer" containerID="f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad" Mar 17 01:32:55 crc kubenswrapper[4735]: E0317 01:32:55.514373 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad\": container with ID starting with f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad not found: ID does not exist" containerID="f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.514404 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad"} err="failed to get container status \"f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad\": rpc error: code = NotFound desc = could not find container \"f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad\": container with ID starting with f83f1ce9390a83e04cb739101a4bc702634f7cf63f96d254ce6b7205169e8bad not found: ID does not exist" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.519253 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.535574 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:55 crc kubenswrapper[4735]: E0317 01:32:55.536046 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-log" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.536068 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-log" Mar 17 01:32:55 crc kubenswrapper[4735]: E0317 01:32:55.536084 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-api" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.536098 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-api" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.536291 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-log" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.536320 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" containerName="nova-api-api" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.537449 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.539908 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.543038 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.543237 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.544465 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.602401 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.602626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-config-data\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.602666 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-internal-tls-certs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.602683 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-public-tls-certs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.602735 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15471aae-c178-4dc7-b566-6996551dff05-logs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.602763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzdbf\" (UniqueName: \"kubernetes.io/projected/15471aae-c178-4dc7-b566-6996551dff05-kube-api-access-dzdbf\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.704989 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-config-data\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.705048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-internal-tls-certs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.705072 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-public-tls-certs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 
crc kubenswrapper[4735]: I0317 01:32:55.705134 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15471aae-c178-4dc7-b566-6996551dff05-logs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.705157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdbf\" (UniqueName: \"kubernetes.io/projected/15471aae-c178-4dc7-b566-6996551dff05-kube-api-access-dzdbf\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.705196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.706008 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15471aae-c178-4dc7-b566-6996551dff05-logs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.709822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-config-data\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.710333 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.724782 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-internal-tls-certs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.725422 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-public-tls-certs\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.725952 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdbf\" (UniqueName: \"kubernetes.io/projected/15471aae-c178-4dc7-b566-6996551dff05-kube-api-access-dzdbf\") pod \"nova-api-0\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " pod="openstack/nova-api-0" Mar 17 01:32:55 crc kubenswrapper[4735]: I0317 01:32:55.857921 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.001726 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.033066 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.152325 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.481230 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15471aae-c178-4dc7-b566-6996551dff05","Type":"ContainerStarted","Data":"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285"} Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.481606 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15471aae-c178-4dc7-b566-6996551dff05","Type":"ContainerStarted","Data":"0da9e821d19e29e205738f7682f6fb9a67fb2f6e6ada0a99db6e8375a6bc582b"} Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.502965 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.727876 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rxr77"] Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.731188 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.738310 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.738629 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.778731 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rxr77"] Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.834648 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.834728 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zj92\" (UniqueName: \"kubernetes.io/projected/de93197b-d6c2-444b-a532-87f1534094c3-kube-api-access-2zj92\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.834764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-config-data\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.834816 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-scripts\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.936453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zj92\" (UniqueName: \"kubernetes.io/projected/de93197b-d6c2-444b-a532-87f1534094c3-kube-api-access-2zj92\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.936532 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-config-data\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.936599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-scripts\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.936744 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.941395 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-config-data\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.941749 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-scripts\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.942972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:56 crc kubenswrapper[4735]: I0317 01:32:56.953322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zj92\" (UniqueName: \"kubernetes.io/projected/de93197b-d6c2-444b-a532-87f1534094c3-kube-api-access-2zj92\") pod \"nova-cell1-cell-mapping-rxr77\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.056080 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.090132 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564a2452-2c8a-41c3-932b-d3e9731502e2" path="/var/lib/kubelet/pods/564a2452-2c8a-41c3-932b-d3e9731502e2/volumes" Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.497109 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15471aae-c178-4dc7-b566-6996551dff05","Type":"ContainerStarted","Data":"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b"} Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.507341 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-central-agent" containerID="cri-o://b27cf6d8ef5ac74fa72905b2b3e8ebf15e28b25d68410183d6f6e677342e438e" gracePeriod=30 Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.507424 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerStarted","Data":"7a071a3cb2dbe118048ec162ac705f89d0d6f9b22b5f16a5d1b1e6ee98c86e43"} Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.507460 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.507494 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="proxy-httpd" containerID="cri-o://7a071a3cb2dbe118048ec162ac705f89d0d6f9b22b5f16a5d1b1e6ee98c86e43" gracePeriod=30 Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.507528 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" 
containerName="sg-core" containerID="cri-o://61ff74f9fd46346d6c44802a63bfa4e89c16c6ec7e9cc4c17a6acd0541931b28" gracePeriod=30 Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.507568 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-notification-agent" containerID="cri-o://50843384154bbb40178842e70addad049293a918e0e745b0fbff961692eb171c" gracePeriod=30 Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.516901 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rxr77"] Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.548802 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.548784508 podStartE2EDuration="2.548784508s" podCreationTimestamp="2026-03-17 01:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:57.544136579 +0000 UTC m=+1403.176369547" watchObservedRunningTime="2026-03-17 01:32:57.548784508 +0000 UTC m=+1403.181017486" Mar 17 01:32:57 crc kubenswrapper[4735]: I0317 01:32:57.578301 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.515133193 podStartE2EDuration="5.578279576s" podCreationTimestamp="2026-03-17 01:32:52 +0000 UTC" firstStartedPulling="2026-03-17 01:32:53.333320443 +0000 UTC m=+1398.965553461" lastFinishedPulling="2026-03-17 01:32:56.396466866 +0000 UTC m=+1402.028699844" observedRunningTime="2026-03-17 01:32:57.573578815 +0000 UTC m=+1403.205811793" watchObservedRunningTime="2026-03-17 01:32:57.578279576 +0000 UTC m=+1403.210512544" Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.522626 4735 generic.go:334] "Generic (PLEG): container finished" podID="77908703-19fa-4ace-8d06-72b05fa0dbe0" 
containerID="7a071a3cb2dbe118048ec162ac705f89d0d6f9b22b5f16a5d1b1e6ee98c86e43" exitCode=0 Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.523263 4735 generic.go:334] "Generic (PLEG): container finished" podID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerID="61ff74f9fd46346d6c44802a63bfa4e89c16c6ec7e9cc4c17a6acd0541931b28" exitCode=2 Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.523292 4735 generic.go:334] "Generic (PLEG): container finished" podID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerID="50843384154bbb40178842e70addad049293a918e0e745b0fbff961692eb171c" exitCode=0 Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.522670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerDied","Data":"7a071a3cb2dbe118048ec162ac705f89d0d6f9b22b5f16a5d1b1e6ee98c86e43"} Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.523465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerDied","Data":"61ff74f9fd46346d6c44802a63bfa4e89c16c6ec7e9cc4c17a6acd0541931b28"} Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.523520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerDied","Data":"50843384154bbb40178842e70addad049293a918e0e745b0fbff961692eb171c"} Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.526586 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rxr77" event={"ID":"de93197b-d6c2-444b-a532-87f1534094c3","Type":"ContainerStarted","Data":"53a61edfce07e1916ab0378827250762dd542547d73f9c3573823b20be6aa4d6"} Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.526630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rxr77" 
event={"ID":"de93197b-d6c2-444b-a532-87f1534094c3","Type":"ContainerStarted","Data":"bd40e4386e994eb374246f3f7d2d5e1d69f04ba6f361fb2252a43df073330f18"} Mar 17 01:32:58 crc kubenswrapper[4735]: I0317 01:32:58.552520 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rxr77" podStartSLOduration=2.552499929 podStartE2EDuration="2.552499929s" podCreationTimestamp="2026-03-17 01:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:32:58.544502072 +0000 UTC m=+1404.176735100" watchObservedRunningTime="2026-03-17 01:32:58.552499929 +0000 UTC m=+1404.184732917" Mar 17 01:32:59 crc kubenswrapper[4735]: I0317 01:32:59.005069 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:32:59 crc kubenswrapper[4735]: I0317 01:32:59.101157 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546765fdff-hb4kb"] Mar 17 01:32:59 crc kubenswrapper[4735]: I0317 01:32:59.101386 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerName="dnsmasq-dns" containerID="cri-o://0f90e5ff83cd5a8d9ab5b61040decb7d59fdead7810def6cd46c32936d87c062" gracePeriod=10 Mar 17 01:32:59 crc kubenswrapper[4735]: I0317 01:32:59.554084 4735 generic.go:334] "Generic (PLEG): container finished" podID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerID="0f90e5ff83cd5a8d9ab5b61040decb7d59fdead7810def6cd46c32936d87c062" exitCode=0 Mar 17 01:32:59 crc kubenswrapper[4735]: I0317 01:32:59.555117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" event={"ID":"c654c0cc-f663-4a38-9872-f4c10a4ee171","Type":"ContainerDied","Data":"0f90e5ff83cd5a8d9ab5b61040decb7d59fdead7810def6cd46c32936d87c062"} Mar 
17 01:32:59 crc kubenswrapper[4735]: I0317 01:32:59.926437 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.043951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-config\") pod \"c654c0cc-f663-4a38-9872-f4c10a4ee171\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.044100 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-nb\") pod \"c654c0cc-f663-4a38-9872-f4c10a4ee171\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.044174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fgjm\" (UniqueName: \"kubernetes.io/projected/c654c0cc-f663-4a38-9872-f4c10a4ee171-kube-api-access-9fgjm\") pod \"c654c0cc-f663-4a38-9872-f4c10a4ee171\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.044197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-sb\") pod \"c654c0cc-f663-4a38-9872-f4c10a4ee171\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.045068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-swift-storage-0\") pod \"c654c0cc-f663-4a38-9872-f4c10a4ee171\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " Mar 17 01:33:00 crc 
kubenswrapper[4735]: I0317 01:33:00.045205 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-svc\") pod \"c654c0cc-f663-4a38-9872-f4c10a4ee171\" (UID: \"c654c0cc-f663-4a38-9872-f4c10a4ee171\") " Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.057100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c654c0cc-f663-4a38-9872-f4c10a4ee171-kube-api-access-9fgjm" (OuterVolumeSpecName: "kube-api-access-9fgjm") pod "c654c0cc-f663-4a38-9872-f4c10a4ee171" (UID: "c654c0cc-f663-4a38-9872-f4c10a4ee171"). InnerVolumeSpecName "kube-api-access-9fgjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.097212 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c654c0cc-f663-4a38-9872-f4c10a4ee171" (UID: "c654c0cc-f663-4a38-9872-f4c10a4ee171"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.140372 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c654c0cc-f663-4a38-9872-f4c10a4ee171" (UID: "c654c0cc-f663-4a38-9872-f4c10a4ee171"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.141255 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c654c0cc-f663-4a38-9872-f4c10a4ee171" (UID: "c654c0cc-f663-4a38-9872-f4c10a4ee171"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.149078 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.149101 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fgjm\" (UniqueName: \"kubernetes.io/projected/c654c0cc-f663-4a38-9872-f4c10a4ee171-kube-api-access-9fgjm\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.149111 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.149119 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.156201 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-config" (OuterVolumeSpecName: "config") pod "c654c0cc-f663-4a38-9872-f4c10a4ee171" (UID: "c654c0cc-f663-4a38-9872-f4c10a4ee171"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.157452 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c654c0cc-f663-4a38-9872-f4c10a4ee171" (UID: "c654c0cc-f663-4a38-9872-f4c10a4ee171"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.251202 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.251233 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c654c0cc-f663-4a38-9872-f4c10a4ee171-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.566912 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" event={"ID":"c654c0cc-f663-4a38-9872-f4c10a4ee171","Type":"ContainerDied","Data":"fe299b728aa4a8c4222a53deeeb8a5a954dfec2e516d3a83ff0335bd93b0d4ed"} Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.567218 4735 scope.go:117] "RemoveContainer" containerID="0f90e5ff83cd5a8d9ab5b61040decb7d59fdead7810def6cd46c32936d87c062" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.567006 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546765fdff-hb4kb" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.589264 4735 scope.go:117] "RemoveContainer" containerID="7a1a644397a3c4f906dec8527d161cc291be57a07d0bee4477dbceecc56fcccf" Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.613240 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546765fdff-hb4kb"] Mar 17 01:33:00 crc kubenswrapper[4735]: I0317 01:33:00.623250 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546765fdff-hb4kb"] Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.083178 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" path="/var/lib/kubelet/pods/c654c0cc-f663-4a38-9872-f4c10a4ee171/volumes" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.583781 4735 generic.go:334] "Generic (PLEG): container finished" podID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerID="b27cf6d8ef5ac74fa72905b2b3e8ebf15e28b25d68410183d6f6e677342e438e" exitCode=0 Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.583919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerDied","Data":"b27cf6d8ef5ac74fa72905b2b3e8ebf15e28b25d68410183d6f6e677342e438e"} Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.584164 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77908703-19fa-4ace-8d06-72b05fa0dbe0","Type":"ContainerDied","Data":"d2acb2263386374b0bc5986ac33e70134191575c351dc0ac8854810cc4d9dc6f"} Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.584181 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2acb2263386374b0bc5986ac33e70134191575c351dc0ac8854810cc4d9dc6f" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.609758 4735 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.679813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8mt6\" (UniqueName: \"kubernetes.io/projected/77908703-19fa-4ace-8d06-72b05fa0dbe0-kube-api-access-g8mt6\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.679913 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-scripts\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.679952 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-config-data\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.679971 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-run-httpd\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.680105 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-ceilometer-tls-certs\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.680154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-log-httpd\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.680176 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-sg-core-conf-yaml\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.680258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-combined-ca-bundle\") pod \"77908703-19fa-4ace-8d06-72b05fa0dbe0\" (UID: \"77908703-19fa-4ace-8d06-72b05fa0dbe0\") " Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.681696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.682059 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.687088 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-scripts" (OuterVolumeSpecName: "scripts") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.687818 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77908703-19fa-4ace-8d06-72b05fa0dbe0-kube-api-access-g8mt6" (OuterVolumeSpecName: "kube-api-access-g8mt6") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "kube-api-access-g8mt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.714466 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.775406 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784105 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784172 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784633 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8mt6\" (UniqueName: \"kubernetes.io/projected/77908703-19fa-4ace-8d06-72b05fa0dbe0-kube-api-access-g8mt6\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784657 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784670 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784688 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77908703-19fa-4ace-8d06-72b05fa0dbe0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.784701 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.872877 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-config-data" (OuterVolumeSpecName: "config-data") pod "77908703-19fa-4ace-8d06-72b05fa0dbe0" (UID: "77908703-19fa-4ace-8d06-72b05fa0dbe0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.887572 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:01 crc kubenswrapper[4735]: I0317 01:33:01.887611 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77908703-19fa-4ace-8d06-72b05fa0dbe0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.593328 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.656982 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.676763 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.705266 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:33:02 crc kubenswrapper[4735]: E0317 01:33:02.705979 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerName="init" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706012 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerName="init" Mar 17 01:33:02 crc kubenswrapper[4735]: E0317 01:33:02.706026 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-central-agent" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706040 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-central-agent" Mar 17 01:33:02 crc kubenswrapper[4735]: E0317 01:33:02.706068 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-notification-agent" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706081 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-notification-agent" Mar 17 01:33:02 crc kubenswrapper[4735]: E0317 01:33:02.706109 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="sg-core" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706121 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="sg-core" Mar 17 01:33:02 crc kubenswrapper[4735]: E0317 01:33:02.706151 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerName="dnsmasq-dns" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706161 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerName="dnsmasq-dns" Mar 17 01:33:02 crc kubenswrapper[4735]: E0317 01:33:02.706179 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="proxy-httpd" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706189 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="proxy-httpd" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706485 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-notification-agent" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706526 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="sg-core" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706562 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="ceilometer-central-agent" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706577 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c654c0cc-f663-4a38-9872-f4c10a4ee171" containerName="dnsmasq-dns" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.706593 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" containerName="proxy-httpd" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.709677 4735 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.711605 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.712289 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.712361 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.716356 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.814342 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rq8f\" (UniqueName: \"kubernetes.io/projected/f1e8c020-341a-416d-9816-fe9fece292ec-kube-api-access-5rq8f\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.814384 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.814445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.814727 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1e8c020-341a-416d-9816-fe9fece292ec-log-httpd\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.814818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-scripts\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.814981 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.815104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-config-data\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.815179 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1e8c020-341a-416d-9816-fe9fece292ec-run-httpd\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.916802 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1e8c020-341a-416d-9816-fe9fece292ec-log-httpd\") pod 
\"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-scripts\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-config-data\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1e8c020-341a-416d-9816-fe9fece292ec-run-httpd\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917249 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rq8f\" (UniqueName: \"kubernetes.io/projected/f1e8c020-341a-416d-9816-fe9fece292ec-kube-api-access-5rq8f\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.917998 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1e8c020-341a-416d-9816-fe9fece292ec-run-httpd\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.918003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1e8c020-341a-416d-9816-fe9fece292ec-log-httpd\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.922775 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.924063 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc 
kubenswrapper[4735]: I0317 01:33:02.924475 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-config-data\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.925194 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.926247 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1e8c020-341a-416d-9816-fe9fece292ec-scripts\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:02 crc kubenswrapper[4735]: I0317 01:33:02.946700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rq8f\" (UniqueName: \"kubernetes.io/projected/f1e8c020-341a-416d-9816-fe9fece292ec-kube-api-access-5rq8f\") pod \"ceilometer-0\" (UID: \"f1e8c020-341a-416d-9816-fe9fece292ec\") " pod="openstack/ceilometer-0" Mar 17 01:33:03 crc kubenswrapper[4735]: I0317 01:33:03.045724 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:33:03 crc kubenswrapper[4735]: I0317 01:33:03.084801 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77908703-19fa-4ace-8d06-72b05fa0dbe0" path="/var/lib/kubelet/pods/77908703-19fa-4ace-8d06-72b05fa0dbe0/volumes" Mar 17 01:33:03 crc kubenswrapper[4735]: I0317 01:33:03.535649 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:33:03 crc kubenswrapper[4735]: I0317 01:33:03.605679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1e8c020-341a-416d-9816-fe9fece292ec","Type":"ContainerStarted","Data":"e0e95c9ffa9b7893def119a31ecb5e6b288f766b792dd94ee41c9b01fe15bd90"} Mar 17 01:33:03 crc kubenswrapper[4735]: I0317 01:33:03.609002 4735 generic.go:334] "Generic (PLEG): container finished" podID="de93197b-d6c2-444b-a532-87f1534094c3" containerID="53a61edfce07e1916ab0378827250762dd542547d73f9c3573823b20be6aa4d6" exitCode=0 Mar 17 01:33:03 crc kubenswrapper[4735]: I0317 01:33:03.609045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rxr77" event={"ID":"de93197b-d6c2-444b-a532-87f1534094c3","Type":"ContainerDied","Data":"53a61edfce07e1916ab0378827250762dd542547d73f9c3573823b20be6aa4d6"} Mar 17 01:33:04 crc kubenswrapper[4735]: I0317 01:33:04.620538 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1e8c020-341a-416d-9816-fe9fece292ec","Type":"ContainerStarted","Data":"e3c19e95e91e6b38d9718ef51103bb260d010303408bc50f9d6cb260ad25a0d8"} Mar 17 01:33:04 crc kubenswrapper[4735]: I0317 01:33:04.621288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1e8c020-341a-416d-9816-fe9fece292ec","Type":"ContainerStarted","Data":"57c0d68770f5f6b6489f0ea751e844be6b320ea84ecf9c8ff313e13cc72dfa96"} Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.054166 4735 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.186482 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-combined-ca-bundle\") pod \"de93197b-d6c2-444b-a532-87f1534094c3\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.186591 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zj92\" (UniqueName: \"kubernetes.io/projected/de93197b-d6c2-444b-a532-87f1534094c3-kube-api-access-2zj92\") pod \"de93197b-d6c2-444b-a532-87f1534094c3\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.186646 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-scripts\") pod \"de93197b-d6c2-444b-a532-87f1534094c3\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.186673 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-config-data\") pod \"de93197b-d6c2-444b-a532-87f1534094c3\" (UID: \"de93197b-d6c2-444b-a532-87f1534094c3\") " Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.194126 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de93197b-d6c2-444b-a532-87f1534094c3-kube-api-access-2zj92" (OuterVolumeSpecName: "kube-api-access-2zj92") pod "de93197b-d6c2-444b-a532-87f1534094c3" (UID: "de93197b-d6c2-444b-a532-87f1534094c3"). InnerVolumeSpecName "kube-api-access-2zj92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.195993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-scripts" (OuterVolumeSpecName: "scripts") pod "de93197b-d6c2-444b-a532-87f1534094c3" (UID: "de93197b-d6c2-444b-a532-87f1534094c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.227976 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de93197b-d6c2-444b-a532-87f1534094c3" (UID: "de93197b-d6c2-444b-a532-87f1534094c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.241242 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-config-data" (OuterVolumeSpecName: "config-data") pod "de93197b-d6c2-444b-a532-87f1534094c3" (UID: "de93197b-d6c2-444b-a532-87f1534094c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.289358 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.289385 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zj92\" (UniqueName: \"kubernetes.io/projected/de93197b-d6c2-444b-a532-87f1534094c3-kube-api-access-2zj92\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.289396 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.289405 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de93197b-d6c2-444b-a532-87f1534094c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.635499 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1e8c020-341a-416d-9816-fe9fece292ec","Type":"ContainerStarted","Data":"f022fb8d440f3487bc49d70d78b64d1049e06692f811e3e193b4ffbf6c9eaf7b"} Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.637770 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rxr77" event={"ID":"de93197b-d6c2-444b-a532-87f1534094c3","Type":"ContainerDied","Data":"bd40e4386e994eb374246f3f7d2d5e1d69f04ba6f361fb2252a43df073330f18"} Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.637812 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd40e4386e994eb374246f3f7d2d5e1d69f04ba6f361fb2252a43df073330f18" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 
01:33:05.637938 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rxr77" Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.825457 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.826713 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-log" containerID="cri-o://fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285" gracePeriod=30 Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.826929 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-api" containerID="cri-o://f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b" gracePeriod=30 Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.844945 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.845111 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8ca93142-4c33-4202-a5ac-119dc29437c6" containerName="nova-scheduler-scheduler" containerID="cri-o://7dda9bdb9df1a2a50aac859fae30ebc01efca6e92f63658d693c3f41adab4278" gracePeriod=30 Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.865550 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.865771 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-log" containerID="cri-o://654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37" gracePeriod=30 Mar 17 
01:33:05 crc kubenswrapper[4735]: I0317 01:33:05.865896 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-metadata" containerID="cri-o://2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5" gracePeriod=30 Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.407034 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.526385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-config-data\") pod \"15471aae-c178-4dc7-b566-6996551dff05\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.526547 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-public-tls-certs\") pod \"15471aae-c178-4dc7-b566-6996551dff05\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.526587 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzdbf\" (UniqueName: \"kubernetes.io/projected/15471aae-c178-4dc7-b566-6996551dff05-kube-api-access-dzdbf\") pod \"15471aae-c178-4dc7-b566-6996551dff05\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.526612 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15471aae-c178-4dc7-b566-6996551dff05-logs\") pod \"15471aae-c178-4dc7-b566-6996551dff05\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.526685 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-internal-tls-certs\") pod \"15471aae-c178-4dc7-b566-6996551dff05\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.526714 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-combined-ca-bundle\") pod \"15471aae-c178-4dc7-b566-6996551dff05\" (UID: \"15471aae-c178-4dc7-b566-6996551dff05\") " Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.527753 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15471aae-c178-4dc7-b566-6996551dff05-logs" (OuterVolumeSpecName: "logs") pod "15471aae-c178-4dc7-b566-6996551dff05" (UID: "15471aae-c178-4dc7-b566-6996551dff05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.536024 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15471aae-c178-4dc7-b566-6996551dff05-kube-api-access-dzdbf" (OuterVolumeSpecName: "kube-api-access-dzdbf") pod "15471aae-c178-4dc7-b566-6996551dff05" (UID: "15471aae-c178-4dc7-b566-6996551dff05"). InnerVolumeSpecName "kube-api-access-dzdbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.566485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15471aae-c178-4dc7-b566-6996551dff05" (UID: "15471aae-c178-4dc7-b566-6996551dff05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.587948 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-config-data" (OuterVolumeSpecName: "config-data") pod "15471aae-c178-4dc7-b566-6996551dff05" (UID: "15471aae-c178-4dc7-b566-6996551dff05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.593594 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15471aae-c178-4dc7-b566-6996551dff05" (UID: "15471aae-c178-4dc7-b566-6996551dff05"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.593780 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "15471aae-c178-4dc7-b566-6996551dff05" (UID: "15471aae-c178-4dc7-b566-6996551dff05"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.629318 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.629356 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzdbf\" (UniqueName: \"kubernetes.io/projected/15471aae-c178-4dc7-b566-6996551dff05-kube-api-access-dzdbf\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.629369 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15471aae-c178-4dc7-b566-6996551dff05-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.629378 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.629387 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.629396 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15471aae-c178-4dc7-b566-6996551dff05-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.651506 4735 generic.go:334] "Generic (PLEG): container finished" podID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerID="654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37" exitCode=143 Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.651568 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5093ba85-f6f4-4937-b6c2-f9b06f712145","Type":"ContainerDied","Data":"654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37"} Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653253 4735 generic.go:334] "Generic (PLEG): container finished" podID="15471aae-c178-4dc7-b566-6996551dff05" containerID="f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b" exitCode=0 Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653281 4735 generic.go:334] "Generic (PLEG): container finished" podID="15471aae-c178-4dc7-b566-6996551dff05" containerID="fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285" exitCode=143 Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15471aae-c178-4dc7-b566-6996551dff05","Type":"ContainerDied","Data":"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b"} Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15471aae-c178-4dc7-b566-6996551dff05","Type":"ContainerDied","Data":"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285"} Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15471aae-c178-4dc7-b566-6996551dff05","Type":"ContainerDied","Data":"0da9e821d19e29e205738f7682f6fb9a67fb2f6e6ada0a99db6e8375a6bc582b"} Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653350 4735 scope.go:117] "RemoveContainer" containerID="f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.653353 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.674289 4735 scope.go:117] "RemoveContainer" containerID="fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.698376 4735 scope.go:117] "RemoveContainer" containerID="f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b" Mar 17 01:33:06 crc kubenswrapper[4735]: E0317 01:33:06.698922 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b\": container with ID starting with f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b not found: ID does not exist" containerID="f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.698991 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b"} err="failed to get container status \"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b\": rpc error: code = NotFound desc = could not find container \"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b\": container with ID starting with f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b not found: ID does not exist" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.699032 4735 scope.go:117] "RemoveContainer" containerID="fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285" Mar 17 01:33:06 crc kubenswrapper[4735]: E0317 01:33:06.699458 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285\": container with ID starting with 
fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285 not found: ID does not exist" containerID="fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.699497 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285"} err="failed to get container status \"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285\": rpc error: code = NotFound desc = could not find container \"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285\": container with ID starting with fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285 not found: ID does not exist" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.699514 4735 scope.go:117] "RemoveContainer" containerID="f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.700169 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b"} err="failed to get container status \"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b\": rpc error: code = NotFound desc = could not find container \"f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b\": container with ID starting with f22d0122a8e486fb2fda5ed1c359c20d0ca68f6df361ad550f5af0b4c6c4072b not found: ID does not exist" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.700192 4735 scope.go:117] "RemoveContainer" containerID="fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.708396 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285"} err="failed to get container status 
\"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285\": rpc error: code = NotFound desc = could not find container \"fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285\": container with ID starting with fa944aebc2778e44fd69fb2890f657f5006a81397119071178fe8a3f778ee285 not found: ID does not exist" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.708517 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.716128 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727054 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 01:33:06 crc kubenswrapper[4735]: E0317 01:33:06.727420 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-api" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727437 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-api" Mar 17 01:33:06 crc kubenswrapper[4735]: E0317 01:33:06.727463 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-log" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727470 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-log" Mar 17 01:33:06 crc kubenswrapper[4735]: E0317 01:33:06.727489 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de93197b-d6c2-444b-a532-87f1534094c3" containerName="nova-manage" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727494 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="de93197b-d6c2-444b-a532-87f1534094c3" containerName="nova-manage" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727666 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="de93197b-d6c2-444b-a532-87f1534094c3" containerName="nova-manage" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727682 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-log" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.727691 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="15471aae-c178-4dc7-b566-6996551dff05" containerName="nova-api-api" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.729409 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.734296 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.734443 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.734623 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.757263 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.832673 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-config-data\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.832724 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.832793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lscg6\" (UniqueName: \"kubernetes.io/projected/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-kube-api-access-lscg6\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.832811 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.832854 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.833024 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-logs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.938020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-config-data\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.938091 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.938147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lscg6\" (UniqueName: \"kubernetes.io/projected/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-kube-api-access-lscg6\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.938163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.938190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.938259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-logs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.939043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-logs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " 
pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.943520 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.946387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.946616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-config-data\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.950358 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:06 crc kubenswrapper[4735]: I0317 01:33:06.960158 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lscg6\" (UniqueName: \"kubernetes.io/projected/8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976-kube-api-access-lscg6\") pod \"nova-api-0\" (UID: \"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976\") " pod="openstack/nova-api-0" Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.067941 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.082187 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15471aae-c178-4dc7-b566-6996551dff05" path="/var/lib/kubelet/pods/15471aae-c178-4dc7-b566-6996551dff05/volumes" Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.616549 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.669248 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1e8c020-341a-416d-9816-fe9fece292ec","Type":"ContainerStarted","Data":"b8f0c7816897ec4c5b0c408087a26681f183679ec2ee8698d1973c5ace01dafc"} Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.669673 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.671789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976","Type":"ContainerStarted","Data":"138c1255375915dd9fe12c6b438544cc7b6ddee97ee350b6979b1d4db1e4ecf3"} Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.677408 4735 generic.go:334] "Generic (PLEG): container finished" podID="8ca93142-4c33-4202-a5ac-119dc29437c6" containerID="7dda9bdb9df1a2a50aac859fae30ebc01efca6e92f63658d693c3f41adab4278" exitCode=0 Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.677432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8ca93142-4c33-4202-a5ac-119dc29437c6","Type":"ContainerDied","Data":"7dda9bdb9df1a2a50aac859fae30ebc01efca6e92f63658d693c3f41adab4278"} Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.705193 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.049472357 podStartE2EDuration="5.705177348s" 
podCreationTimestamp="2026-03-17 01:33:02 +0000 UTC" firstStartedPulling="2026-03-17 01:33:03.525503846 +0000 UTC m=+1409.157736864" lastFinishedPulling="2026-03-17 01:33:07.181208877 +0000 UTC m=+1412.813441855" observedRunningTime="2026-03-17 01:33:07.698017031 +0000 UTC m=+1413.330250019" watchObservedRunningTime="2026-03-17 01:33:07.705177348 +0000 UTC m=+1413.337410326" Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.783649 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.981283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-combined-ca-bundle\") pod \"8ca93142-4c33-4202-a5ac-119dc29437c6\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.981564 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjx47\" (UniqueName: \"kubernetes.io/projected/8ca93142-4c33-4202-a5ac-119dc29437c6-kube-api-access-rjx47\") pod \"8ca93142-4c33-4202-a5ac-119dc29437c6\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.981611 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-config-data\") pod \"8ca93142-4c33-4202-a5ac-119dc29437c6\" (UID: \"8ca93142-4c33-4202-a5ac-119dc29437c6\") " Mar 17 01:33:07 crc kubenswrapper[4735]: I0317 01:33:07.988192 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca93142-4c33-4202-a5ac-119dc29437c6-kube-api-access-rjx47" (OuterVolumeSpecName: "kube-api-access-rjx47") pod "8ca93142-4c33-4202-a5ac-119dc29437c6" (UID: "8ca93142-4c33-4202-a5ac-119dc29437c6"). 
InnerVolumeSpecName "kube-api-access-rjx47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.014469 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-config-data" (OuterVolumeSpecName: "config-data") pod "8ca93142-4c33-4202-a5ac-119dc29437c6" (UID: "8ca93142-4c33-4202-a5ac-119dc29437c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.029762 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ca93142-4c33-4202-a5ac-119dc29437c6" (UID: "8ca93142-4c33-4202-a5ac-119dc29437c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.083409 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.083441 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjx47\" (UniqueName: \"kubernetes.io/projected/8ca93142-4c33-4202-a5ac-119dc29437c6-kube-api-access-rjx47\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.083453 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca93142-4c33-4202-a5ac-119dc29437c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.725949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"8ca93142-4c33-4202-a5ac-119dc29437c6","Type":"ContainerDied","Data":"fd55dd67baa17133b519b979f80245666e00af0d72e48885fa7e1322b8a0c43b"} Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.726391 4735 scope.go:117] "RemoveContainer" containerID="7dda9bdb9df1a2a50aac859fae30ebc01efca6e92f63658d693c3f41adab4278" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.726541 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.732612 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976","Type":"ContainerStarted","Data":"4be32eed3b440fc3ffe3b4e35fdcf6191d000d45bb61786a1159088381fe33fc"} Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.732655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976","Type":"ContainerStarted","Data":"66e7a8122b9b7e188cd12924e5a6f0be654d35c983dd936f603554f7bb4a77a4"} Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.781961 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.781932521 podStartE2EDuration="2.781932521s" podCreationTimestamp="2026-03-17 01:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:33:08.759953318 +0000 UTC m=+1414.392186316" watchObservedRunningTime="2026-03-17 01:33:08.781932521 +0000 UTC m=+1414.414165509" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.816102 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.839004 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:33:08 crc kubenswrapper[4735]: 
I0317 01:33:08.861031 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:33:08 crc kubenswrapper[4735]: E0317 01:33:08.862213 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca93142-4c33-4202-a5ac-119dc29437c6" containerName="nova-scheduler-scheduler" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.862461 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca93142-4c33-4202-a5ac-119dc29437c6" containerName="nova-scheduler-scheduler" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.862887 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca93142-4c33-4202-a5ac-119dc29437c6" containerName="nova-scheduler-scheduler" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.863819 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.866921 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.892746 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.914647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvkmx\" (UniqueName: \"kubernetes.io/projected/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-kube-api-access-hvkmx\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.914791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " 
pod="openstack/nova-scheduler-0" Mar 17 01:33:08 crc kubenswrapper[4735]: I0317 01:33:08.914835 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-config-data\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.016746 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvkmx\" (UniqueName: \"kubernetes.io/projected/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-kube-api-access-hvkmx\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.017023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.017094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-config-data\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.030794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.032907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-config-data\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.033612 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvkmx\" (UniqueName: \"kubernetes.io/projected/6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b-kube-api-access-hvkmx\") pod \"nova-scheduler-0\" (UID: \"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b\") " pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.089503 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca93142-4c33-4202-a5ac-119dc29437c6" path="/var/lib/kubelet/pods/8ca93142-4c33-4202-a5ac-119dc29437c6/volumes" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.190705 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.453438 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.525370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5093ba85-f6f4-4937-b6c2-f9b06f712145-logs\") pod \"5093ba85-f6f4-4937-b6c2-f9b06f712145\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.525559 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfh2\" (UniqueName: \"kubernetes.io/projected/5093ba85-f6f4-4937-b6c2-f9b06f712145-kube-api-access-kjfh2\") pod \"5093ba85-f6f4-4937-b6c2-f9b06f712145\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.525656 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-config-data\") pod \"5093ba85-f6f4-4937-b6c2-f9b06f712145\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.525750 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-nova-metadata-tls-certs\") pod \"5093ba85-f6f4-4937-b6c2-f9b06f712145\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.525922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-combined-ca-bundle\") pod \"5093ba85-f6f4-4937-b6c2-f9b06f712145\" (UID: \"5093ba85-f6f4-4937-b6c2-f9b06f712145\") " Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.526118 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5093ba85-f6f4-4937-b6c2-f9b06f712145-logs" (OuterVolumeSpecName: "logs") pod "5093ba85-f6f4-4937-b6c2-f9b06f712145" (UID: "5093ba85-f6f4-4937-b6c2-f9b06f712145"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.526589 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5093ba85-f6f4-4937-b6c2-f9b06f712145-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.539936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5093ba85-f6f4-4937-b6c2-f9b06f712145-kube-api-access-kjfh2" (OuterVolumeSpecName: "kube-api-access-kjfh2") pod "5093ba85-f6f4-4937-b6c2-f9b06f712145" (UID: "5093ba85-f6f4-4937-b6c2-f9b06f712145"). InnerVolumeSpecName "kube-api-access-kjfh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.557046 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-config-data" (OuterVolumeSpecName: "config-data") pod "5093ba85-f6f4-4937-b6c2-f9b06f712145" (UID: "5093ba85-f6f4-4937-b6c2-f9b06f712145"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.595948 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5093ba85-f6f4-4937-b6c2-f9b06f712145" (UID: "5093ba85-f6f4-4937-b6c2-f9b06f712145"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.615350 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5093ba85-f6f4-4937-b6c2-f9b06f712145" (UID: "5093ba85-f6f4-4937-b6c2-f9b06f712145"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.627288 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfh2\" (UniqueName: \"kubernetes.io/projected/5093ba85-f6f4-4937-b6c2-f9b06f712145-kube-api-access-kjfh2\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.627323 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.627334 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.627343 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093ba85-f6f4-4937-b6c2-f9b06f712145-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.766244 4735 generic.go:334] "Generic (PLEG): container finished" podID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerID="2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5" exitCode=0 Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.766335 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"5093ba85-f6f4-4937-b6c2-f9b06f712145","Type":"ContainerDied","Data":"2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5"} Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.766379 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5093ba85-f6f4-4937-b6c2-f9b06f712145","Type":"ContainerDied","Data":"7441c2f62d9fe9c064e30d1008ef07a6c7e1a07fe86607534cf8df0670fadfcc"} Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.766402 4735 scope.go:117] "RemoveContainer" containerID="2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.766596 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.796791 4735 scope.go:117] "RemoveContainer" containerID="654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.806067 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.827725 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.831668 4735 scope.go:117] "RemoveContainer" containerID="2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5" Mar 17 01:33:09 crc kubenswrapper[4735]: E0317 01:33:09.833491 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5\": container with ID starting with 2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5 not found: ID does not exist" containerID="2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5" Mar 17 01:33:09 crc 
kubenswrapper[4735]: I0317 01:33:09.833531 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5"} err="failed to get container status \"2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5\": rpc error: code = NotFound desc = could not find container \"2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5\": container with ID starting with 2d6eed13d0a8a38ee804b1d86681244625a3ffd3286bfed03ba73cabd6e2bdb5 not found: ID does not exist" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.833558 4735 scope.go:117] "RemoveContainer" containerID="654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37" Mar 17 01:33:09 crc kubenswrapper[4735]: E0317 01:33:09.833838 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37\": container with ID starting with 654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37 not found: ID does not exist" containerID="654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.833888 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37"} err="failed to get container status \"654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37\": rpc error: code = NotFound desc = could not find container \"654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37\": container with ID starting with 654ad6cee1f6ac65c87749217516289ec19870e3a2958f99ccb2c23973957d37 not found: ID does not exist" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.843768 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:33:09 crc 
kubenswrapper[4735]: I0317 01:33:09.855937 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:33:09 crc kubenswrapper[4735]: E0317 01:33:09.856500 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-metadata" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.856519 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-metadata" Mar 17 01:33:09 crc kubenswrapper[4735]: E0317 01:33:09.856533 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-log" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.856540 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-log" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.856747 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-metadata" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.856766 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" containerName="nova-metadata-log" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.857930 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.864946 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.883290 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.885992 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.939675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvx6d\" (UniqueName: \"kubernetes.io/projected/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-kube-api-access-hvx6d\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.939751 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.939782 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.939802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-config-data\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:09 crc kubenswrapper[4735]: I0317 01:33:09.939820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-logs\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.041299 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.041347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.041377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-config-data\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.041394 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-logs\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 
01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.041510 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvx6d\" (UniqueName: \"kubernetes.io/projected/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-kube-api-access-hvx6d\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.042263 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-logs\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.045846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.047227 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.047619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-config-data\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.061380 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvx6d\" (UniqueName: 
\"kubernetes.io/projected/6a8bc38a-47a7-4623-bf64-ac77d137ceb4-kube-api-access-hvx6d\") pod \"nova-metadata-0\" (UID: \"6a8bc38a-47a7-4623-bf64-ac77d137ceb4\") " pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.184230 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.690011 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.785097 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b","Type":"ContainerStarted","Data":"aa85eed412b7f0cf2e6ee064264030de09d77e19921b7aaa3de9624db26fc8ed"} Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.785150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b","Type":"ContainerStarted","Data":"701c72f90662102ea48b15999c8701fa7e561adda94efdb8c67c617ac73bf154"} Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.786679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a8bc38a-47a7-4623-bf64-ac77d137ceb4","Type":"ContainerStarted","Data":"8c5f2185a49994a496fa796a880a402c1e04d6c286f3c5a87833ae1c0eacb639"} Mar 17 01:33:10 crc kubenswrapper[4735]: I0317 01:33:10.809481 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.809463089 podStartE2EDuration="2.809463089s" podCreationTimestamp="2026-03-17 01:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:33:10.804395681 +0000 UTC m=+1416.436628669" watchObservedRunningTime="2026-03-17 01:33:10.809463089 +0000 UTC m=+1416.441696067" Mar 17 01:33:11 
crc kubenswrapper[4735]: I0317 01:33:11.096732 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5093ba85-f6f4-4937-b6c2-f9b06f712145" path="/var/lib/kubelet/pods/5093ba85-f6f4-4937-b6c2-f9b06f712145/volumes" Mar 17 01:33:11 crc kubenswrapper[4735]: I0317 01:33:11.802751 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a8bc38a-47a7-4623-bf64-ac77d137ceb4","Type":"ContainerStarted","Data":"43a45b0aa2a88f89a559e11abb5aa2c03b29944501cc9b32bc7cb012775cad39"} Mar 17 01:33:11 crc kubenswrapper[4735]: I0317 01:33:11.802805 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a8bc38a-47a7-4623-bf64-ac77d137ceb4","Type":"ContainerStarted","Data":"73fa3b2ef2fd9111a5278902c78844e6571179b621f965bca80b3bf205219629"} Mar 17 01:33:11 crc kubenswrapper[4735]: I0317 01:33:11.827164 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.827146135 podStartE2EDuration="2.827146135s" podCreationTimestamp="2026-03-17 01:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:33:11.820138462 +0000 UTC m=+1417.452371480" watchObservedRunningTime="2026-03-17 01:33:11.827146135 +0000 UTC m=+1417.459379113" Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.606845 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.607177 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.607217 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.607854 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8261aabe9466b0d5bcb35dde5a9ae7325de3d924eb46b6cef2441ec6483254b"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.607982 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://e8261aabe9466b0d5bcb35dde5a9ae7325de3d924eb46b6cef2441ec6483254b" gracePeriod=600 Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.829842 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="e8261aabe9466b0d5bcb35dde5a9ae7325de3d924eb46b6cef2441ec6483254b" exitCode=0 Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.830319 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"e8261aabe9466b0d5bcb35dde5a9ae7325de3d924eb46b6cef2441ec6483254b"} Mar 17 01:33:12 crc kubenswrapper[4735]: I0317 01:33:12.830713 4735 scope.go:117] "RemoveContainer" containerID="f3cc1abd0b398fcd26237f6714e9b38efa46867f6f292d0e7478cf3af5b13d4a" Mar 17 01:33:13 crc 
kubenswrapper[4735]: I0317 01:33:13.847114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba"} Mar 17 01:33:14 crc kubenswrapper[4735]: I0317 01:33:14.191519 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 01:33:17 crc kubenswrapper[4735]: I0317 01:33:17.068963 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 01:33:17 crc kubenswrapper[4735]: I0317 01:33:17.071358 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 01:33:18 crc kubenswrapper[4735]: I0317 01:33:18.091173 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 01:33:18 crc kubenswrapper[4735]: I0317 01:33:18.091233 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:33:19 crc kubenswrapper[4735]: I0317 01:33:19.192146 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 01:33:19 crc kubenswrapper[4735]: I0317 01:33:19.234749 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 01:33:19 crc kubenswrapper[4735]: I0317 01:33:19.962355 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 01:33:20 crc kubenswrapper[4735]: I0317 01:33:20.184384 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 01:33:20 crc kubenswrapper[4735]: I0317 01:33:20.184767 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 01:33:21 crc kubenswrapper[4735]: I0317 01:33:21.190046 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6a8bc38a-47a7-4623-bf64-ac77d137ceb4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:33:21 crc kubenswrapper[4735]: I0317 01:33:21.195032 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6a8bc38a-47a7-4623-bf64-ac77d137ceb4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 01:33:25 crc kubenswrapper[4735]: I0317 01:33:25.068645 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 01:33:25 crc kubenswrapper[4735]: I0317 01:33:25.069317 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 01:33:27 crc kubenswrapper[4735]: I0317 01:33:27.090516 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 01:33:27 crc kubenswrapper[4735]: I0317 01:33:27.090881 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 01:33:27 crc kubenswrapper[4735]: I0317 01:33:27.103019 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 01:33:27 crc 
kubenswrapper[4735]: I0317 01:33:27.103191 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 01:33:28 crc kubenswrapper[4735]: I0317 01:33:28.184412 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 01:33:28 crc kubenswrapper[4735]: I0317 01:33:28.184485 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 01:33:30 crc kubenswrapper[4735]: I0317 01:33:30.193820 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 01:33:30 crc kubenswrapper[4735]: I0317 01:33:30.199974 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 01:33:30 crc kubenswrapper[4735]: I0317 01:33:30.201643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 01:33:31 crc kubenswrapper[4735]: I0317 01:33:31.056493 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 01:33:33 crc kubenswrapper[4735]: I0317 01:33:33.058333 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 01:33:44 crc kubenswrapper[4735]: I0317 01:33:44.095829 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:33:45 crc kubenswrapper[4735]: I0317 01:33:45.700665 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:33:48 crc kubenswrapper[4735]: I0317 01:33:48.558514 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="rabbitmq" containerID="cri-o://6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6" gracePeriod=604796 Mar 17 01:33:49 crc 
kubenswrapper[4735]: I0317 01:33:49.550502 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="rabbitmq" containerID="cri-o://b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e" gracePeriod=604797 Mar 17 01:33:54 crc kubenswrapper[4735]: I0317 01:33:54.221567 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 17 01:33:54 crc kubenswrapper[4735]: I0317 01:33:54.795998 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.106738 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.289155 4735 generic.go:334] "Generic (PLEG): container finished" podID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerID="6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6" exitCode=0 Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.289208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb9ecd2f-1a7b-4c45-b722-c88f25b27487","Type":"ContainerDied","Data":"6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6"} Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.289234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb9ecd2f-1a7b-4c45-b722-c88f25b27487","Type":"ContainerDied","Data":"09406213dee28daa5ca7c222fa5a3305f77b41006592b35ab4ddc9c388636962"} Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.289274 4735 scope.go:117] "RemoveContainer" containerID="6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.289423 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.306331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-pod-info\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.306677 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-tls\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.306737 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-erlang-cookie-secret\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.306758 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.306921 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq6dn\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-kube-api-access-hq6dn\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.306973 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-confd\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.307000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-erlang-cookie\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.307032 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-plugins\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.307051 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-config-data\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.307091 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-server-conf\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.307155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-plugins-conf\") pod \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\" (UID: \"cb9ecd2f-1a7b-4c45-b722-c88f25b27487\") " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 
01:33:55.315737 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.316190 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.320636 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.322142 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.331011 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.331096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-pod-info" (OuterVolumeSpecName: "pod-info") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.331172 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.346279 4735 scope.go:117] "RemoveContainer" containerID="d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.346444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-kube-api-access-hq6dn" (OuterVolumeSpecName: "kube-api-access-hq6dn") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "kube-api-access-hq6dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.403773 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-config-data" (OuterVolumeSpecName: "config-data") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412459 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-pod-info\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412495 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412505 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412527 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412537 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq6dn\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-kube-api-access-hq6dn\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412546 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412555 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412563 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.412573 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.433068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-server-conf" (OuterVolumeSpecName: "server-conf") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.447074 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.472102 4735 scope.go:117] "RemoveContainer" containerID="6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6" Mar 17 01:33:55 crc kubenswrapper[4735]: E0317 01:33:55.473437 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6\": container with ID starting with 6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6 not found: ID does not exist" containerID="6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.473466 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6"} err="failed to get container status \"6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6\": rpc error: code = NotFound desc = could not find container \"6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6\": container with ID starting with 6273221f77c4b0091df187de38b784c49100c49be7bb5bf69ce94a004720c5e6 not found: ID does not exist" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.473484 4735 scope.go:117] "RemoveContainer" containerID="d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423" Mar 17 01:33:55 crc kubenswrapper[4735]: E0317 01:33:55.473848 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423\": container with ID starting with 
d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423 not found: ID does not exist" containerID="d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.473901 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423"} err="failed to get container status \"d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423\": rpc error: code = NotFound desc = could not find container \"d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423\": container with ID starting with d61782d3dfef35486ac069842783961b1659603f4fd81f905b2e09b5610d2423 not found: ID does not exist" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.503590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cb9ecd2f-1a7b-4c45-b722-c88f25b27487" (UID: "cb9ecd2f-1a7b-4c45-b722-c88f25b27487"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.517041 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.517073 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.517084 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb9ecd2f-1a7b-4c45-b722-c88f25b27487-server-conf\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.620367 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.631940 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.670062 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:33:55 crc kubenswrapper[4735]: E0317 01:33:55.672190 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="rabbitmq" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.672215 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="rabbitmq" Mar 17 01:33:55 crc kubenswrapper[4735]: E0317 01:33:55.672232 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="setup-container" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.672239 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="setup-container" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.672528 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" containerName="rabbitmq" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.675875 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.696659 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.703431 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.703635 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.703665 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.703748 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.704009 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.704030 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5slxq" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.721305 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822246 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822327 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-config-data\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822413 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822447 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7t8\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-kube-api-access-zz7t8\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.822467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923559 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-config-data\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923654 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " 
pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7t8\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-kube-api-access-zz7t8\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.923807 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.924983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.925119 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.927214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.927726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.927811 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") device mount path 
\"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.928264 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-config-data\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.933578 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.943532 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.955723 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.957728 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:55 crc kubenswrapper[4735]: I0317 01:33:55.987556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zz7t8\" (UniqueName: \"kubernetes.io/projected/477a21f3-fdbe-42ea-bcd9-05fc4dca6a52-kube-api-access-zz7t8\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.068689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52\") " pod="openstack/rabbitmq-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.228903 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.295193 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.317775 4735 generic.go:334] "Generic (PLEG): container finished" podID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerID="b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e" exitCode=0 Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.317895 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e","Type":"ContainerDied","Data":"b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e"} Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.317929 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e","Type":"ContainerDied","Data":"cdd76d1b426c3fbc6f598e8cb35649494b662f5888648ef503cc97d391974fd8"} Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.317947 4735 scope.go:117] "RemoveContainer" containerID="b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e" Mar 17 01:33:56 crc 
kubenswrapper[4735]: I0317 01:33:56.318057 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-config-data\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362309 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-server-conf\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362400 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-pod-info\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362421 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-plugins-conf\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362459 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmwrf\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-kube-api-access-kmwrf\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362480 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-tls\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362578 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-erlang-cookie-secret\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362614 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-plugins\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362668 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-confd\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" (UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.362701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-erlang-cookie\") pod \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\" 
(UID: \"a4bda0b8-ed44-4576-a404-b25cc7f8ea6e\") " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.363522 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.368473 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.370280 4735 scope.go:117] "RemoveContainer" containerID="7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.370321 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-pod-info" (OuterVolumeSpecName: "pod-info") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.370977 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.372448 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.387052 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-kube-api-access-kmwrf" (OuterVolumeSpecName: "kube-api-access-kmwrf") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "kube-api-access-kmwrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.387153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.389032 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.412039 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-config-data" (OuterVolumeSpecName: "config-data") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.464249 4735 scope.go:117] "RemoveContainer" containerID="b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e" Mar 17 01:33:56 crc kubenswrapper[4735]: E0317 01:33:56.466786 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e\": container with ID starting with b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e not found: ID does not exist" containerID="b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.466843 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e"} err="failed to get container status \"b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e\": rpc error: code = NotFound desc = could not find container \"b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e\": container with ID starting with b52a7a9c753de19b74f8b9d816925916fbee0279d5435da118d0a58782a3c95e not found: ID does not exist" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.466898 4735 scope.go:117] "RemoveContainer" containerID="7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467003 4735 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467026 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467035 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467044 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmwrf\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-kube-api-access-kmwrf\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467054 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467066 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467092 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467102 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467118 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: E0317 01:33:56.467142 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155\": container with ID starting with 7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155 not found: ID does not exist" containerID="7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.467164 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155"} err="failed to get container status \"7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155\": rpc error: code = NotFound desc = could not find container \"7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155\": container with ID starting with 7f3f1b9428ddec992ad48b4fe7a2d495757453e503a419457bd66c2b7c287155 not found: ID does not exist" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.474847 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-server-conf" (OuterVolumeSpecName: "server-conf") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.538137 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.572024 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.572053 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.602284 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" (UID: "a4bda0b8-ed44-4576-a404-b25cc7f8ea6e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.656016 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.668601 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.676014 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.690411 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:33:56 crc kubenswrapper[4735]: E0317 01:33:56.690785 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="rabbitmq" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.690801 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="rabbitmq" Mar 17 01:33:56 crc kubenswrapper[4735]: E0317 01:33:56.690816 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="setup-container" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.690823 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="setup-container" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.691017 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" containerName="rabbitmq" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.691910 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.698435 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.698479 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.698770 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.698829 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.698923 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.699158 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.700055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lng5c" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.711498 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777636 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5023c673-f338-49b0-b6ef-9bf53abfdb28-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777846 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcv55\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-kube-api-access-xcv55\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.777971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5023c673-f338-49b0-b6ef-9bf53abfdb28-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.778000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.870293 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 01:33:56 crc 
kubenswrapper[4735]: I0317 01:33:56.879234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.879300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.879670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.879945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.880973 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881074 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5023c673-f338-49b0-b6ef-9bf53abfdb28-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881103 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcv55\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-kube-api-access-xcv55\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5023c673-f338-49b0-b6ef-9bf53abfdb28-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.881665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.882054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.882055 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.882392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5023c673-f338-49b0-b6ef-9bf53abfdb28-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.886702 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.888275 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5023c673-f338-49b0-b6ef-9bf53abfdb28-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.889441 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.898542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5023c673-f338-49b0-b6ef-9bf53abfdb28-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.909624 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcv55\" (UniqueName: \"kubernetes.io/projected/5023c673-f338-49b0-b6ef-9bf53abfdb28-kube-api-access-xcv55\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:56 crc kubenswrapper[4735]: I0317 01:33:56.929183 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5023c673-f338-49b0-b6ef-9bf53abfdb28\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:57 crc kubenswrapper[4735]: I0317 01:33:57.015245 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:33:57 crc kubenswrapper[4735]: I0317 01:33:57.083033 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4bda0b8-ed44-4576-a404-b25cc7f8ea6e" path="/var/lib/kubelet/pods/a4bda0b8-ed44-4576-a404-b25cc7f8ea6e/volumes" Mar 17 01:33:57 crc kubenswrapper[4735]: I0317 01:33:57.083779 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9ecd2f-1a7b-4c45-b722-c88f25b27487" path="/var/lib/kubelet/pods/cb9ecd2f-1a7b-4c45-b722-c88f25b27487/volumes" Mar 17 01:33:57 crc kubenswrapper[4735]: I0317 01:33:57.348012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52","Type":"ContainerStarted","Data":"67815082cf5f894cd4e5e75d4d01dd69e444b94bd1652924e5325bcd93264715"} Mar 17 01:33:57 crc kubenswrapper[4735]: W0317 01:33:57.465169 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5023c673_f338_49b0_b6ef_9bf53abfdb28.slice/crio-f249b3e3ac517cce7662283d7619bafa295ae9b165f1e869bacf91f7d929269a WatchSource:0}: Error finding container 
f249b3e3ac517cce7662283d7619bafa295ae9b165f1e869bacf91f7d929269a: Status 404 returned error can't find the container with id f249b3e3ac517cce7662283d7619bafa295ae9b165f1e869bacf91f7d929269a Mar 17 01:33:57 crc kubenswrapper[4735]: I0317 01:33:57.476422 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 01:33:58 crc kubenswrapper[4735]: I0317 01:33:58.359911 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5023c673-f338-49b0-b6ef-9bf53abfdb28","Type":"ContainerStarted","Data":"f249b3e3ac517cce7662283d7619bafa295ae9b165f1e869bacf91f7d929269a"} Mar 17 01:33:58 crc kubenswrapper[4735]: I0317 01:33:58.943456 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8654f7787c-gjsb2"] Mar 17 01:33:58 crc kubenswrapper[4735]: I0317 01:33:58.945109 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:58 crc kubenswrapper[4735]: I0317 01:33:58.946674 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 17 01:33:58 crc kubenswrapper[4735]: I0317 01:33:58.965035 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8654f7787c-gjsb2"] Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019251 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-swift-storage-0\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019376 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-sb\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019477 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44dd\" (UniqueName: \"kubernetes.io/projected/49017d12-20c8-4df4-88e8-8da4ed4d106e-kube-api-access-d44dd\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-svc\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019536 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-config\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019587 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-openstack-edpm-ipam\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.019634 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-nb\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-svc\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121720 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-config\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-openstack-edpm-ipam\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-nb\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-swift-storage-0\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-sb\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.121968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d44dd\" (UniqueName: \"kubernetes.io/projected/49017d12-20c8-4df4-88e8-8da4ed4d106e-kube-api-access-d44dd\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.122964 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-svc\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.123488 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-config\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.123997 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-openstack-edpm-ipam\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.124484 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-nb\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.125941 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-swift-storage-0\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.126418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-sb\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.139408 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44dd\" (UniqueName: \"kubernetes.io/projected/49017d12-20c8-4df4-88e8-8da4ed4d106e-kube-api-access-d44dd\") pod \"dnsmasq-dns-8654f7787c-gjsb2\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.265845 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.385889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5023c673-f338-49b0-b6ef-9bf53abfdb28","Type":"ContainerStarted","Data":"9f8fd15de164bda4618b67778170c66182a3dacdbc9ce1b47f3bdc5feb2819e2"} Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.388226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52","Type":"ContainerStarted","Data":"662e765a29917a365e4b9c3598a7485a4095605acdfa1b3c7c036cb28c3ff21e"} Mar 17 01:33:59 crc kubenswrapper[4735]: I0317 01:33:59.753071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8654f7787c-gjsb2"] Mar 17 01:33:59 crc kubenswrapper[4735]: W0317 01:33:59.755910 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49017d12_20c8_4df4_88e8_8da4ed4d106e.slice/crio-bfc215087937bb0370f867e4dfc5cf44bcbce960d9fcf1c08285a54b32ac461b WatchSource:0}: Error finding container bfc215087937bb0370f867e4dfc5cf44bcbce960d9fcf1c08285a54b32ac461b: Status 404 returned error can't find the container with id bfc215087937bb0370f867e4dfc5cf44bcbce960d9fcf1c08285a54b32ac461b Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.133260 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561854-4vwb5"] Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.134713 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.137949 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.140334 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.141119 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.145028 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-4vwb5"] Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.266503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jmh\" (UniqueName: \"kubernetes.io/projected/25d2c860-7702-4c17-b763-8a2da103ca6d-kube-api-access-25jmh\") pod \"auto-csr-approver-29561854-4vwb5\" (UID: \"25d2c860-7702-4c17-b763-8a2da103ca6d\") " pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.368653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jmh\" (UniqueName: \"kubernetes.io/projected/25d2c860-7702-4c17-b763-8a2da103ca6d-kube-api-access-25jmh\") pod \"auto-csr-approver-29561854-4vwb5\" (UID: \"25d2c860-7702-4c17-b763-8a2da103ca6d\") " pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.384278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jmh\" (UniqueName: \"kubernetes.io/projected/25d2c860-7702-4c17-b763-8a2da103ca6d-kube-api-access-25jmh\") pod \"auto-csr-approver-29561854-4vwb5\" (UID: \"25d2c860-7702-4c17-b763-8a2da103ca6d\") " 
pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.400744 4735 generic.go:334] "Generic (PLEG): container finished" podID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerID="03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5" exitCode=0 Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.400881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" event={"ID":"49017d12-20c8-4df4-88e8-8da4ed4d106e","Type":"ContainerDied","Data":"03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5"} Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.400931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" event={"ID":"49017d12-20c8-4df4-88e8-8da4ed4d106e","Type":"ContainerStarted","Data":"bfc215087937bb0370f867e4dfc5cf44bcbce960d9fcf1c08285a54b32ac461b"} Mar 17 01:34:00 crc kubenswrapper[4735]: I0317 01:34:00.494401 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:01 crc kubenswrapper[4735]: I0317 01:34:01.105107 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-4vwb5"] Mar 17 01:34:01 crc kubenswrapper[4735]: W0317 01:34:01.115768 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25d2c860_7702_4c17_b763_8a2da103ca6d.slice/crio-7b26007c64fc32cd45a85b8aa6604cdff9e398078ec0dfdd4e424eab2144366c WatchSource:0}: Error finding container 7b26007c64fc32cd45a85b8aa6604cdff9e398078ec0dfdd4e424eab2144366c: Status 404 returned error can't find the container with id 7b26007c64fc32cd45a85b8aa6604cdff9e398078ec0dfdd4e424eab2144366c Mar 17 01:34:01 crc kubenswrapper[4735]: I0317 01:34:01.411341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" event={"ID":"49017d12-20c8-4df4-88e8-8da4ed4d106e","Type":"ContainerStarted","Data":"61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5"} Mar 17 01:34:01 crc kubenswrapper[4735]: I0317 01:34:01.411529 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:34:01 crc kubenswrapper[4735]: I0317 01:34:01.413111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" event={"ID":"25d2c860-7702-4c17-b763-8a2da103ca6d","Type":"ContainerStarted","Data":"7b26007c64fc32cd45a85b8aa6604cdff9e398078ec0dfdd4e424eab2144366c"} Mar 17 01:34:01 crc kubenswrapper[4735]: I0317 01:34:01.442074 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" podStartSLOduration=3.442058301 podStartE2EDuration="3.442058301s" podCreationTimestamp="2026-03-17 01:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:34:01.435398186 +0000 UTC m=+1467.067631174" watchObservedRunningTime="2026-03-17 01:34:01.442058301 +0000 UTC m=+1467.074291279" Mar 17 01:34:03 crc kubenswrapper[4735]: I0317 01:34:03.483012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" event={"ID":"25d2c860-7702-4c17-b763-8a2da103ca6d","Type":"ContainerStarted","Data":"74f69d396b063e0cc21e6cb1dd81e367276d64596eb0b8ad180810b3b0140457"} Mar 17 01:34:03 crc kubenswrapper[4735]: I0317 01:34:03.518699 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" podStartSLOduration=2.340382594 podStartE2EDuration="3.518680533s" podCreationTimestamp="2026-03-17 01:34:00 +0000 UTC" firstStartedPulling="2026-03-17 01:34:01.118706245 +0000 UTC m=+1466.750939223" lastFinishedPulling="2026-03-17 01:34:02.297004174 +0000 UTC m=+1467.929237162" observedRunningTime="2026-03-17 01:34:03.507582515 +0000 UTC m=+1469.139815503" watchObservedRunningTime="2026-03-17 01:34:03.518680533 +0000 UTC m=+1469.150913521" Mar 17 01:34:04 crc kubenswrapper[4735]: I0317 01:34:04.497238 4735 generic.go:334] "Generic (PLEG): container finished" podID="25d2c860-7702-4c17-b763-8a2da103ca6d" containerID="74f69d396b063e0cc21e6cb1dd81e367276d64596eb0b8ad180810b3b0140457" exitCode=0 Mar 17 01:34:04 crc kubenswrapper[4735]: I0317 01:34:04.497378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" event={"ID":"25d2c860-7702-4c17-b763-8a2da103ca6d","Type":"ContainerDied","Data":"74f69d396b063e0cc21e6cb1dd81e367276d64596eb0b8ad180810b3b0140457"} Mar 17 01:34:05 crc kubenswrapper[4735]: I0317 01:34:05.947504 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:05 crc kubenswrapper[4735]: I0317 01:34:05.995749 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25jmh\" (UniqueName: \"kubernetes.io/projected/25d2c860-7702-4c17-b763-8a2da103ca6d-kube-api-access-25jmh\") pod \"25d2c860-7702-4c17-b763-8a2da103ca6d\" (UID: \"25d2c860-7702-4c17-b763-8a2da103ca6d\") " Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.002510 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d2c860-7702-4c17-b763-8a2da103ca6d-kube-api-access-25jmh" (OuterVolumeSpecName: "kube-api-access-25jmh") pod "25d2c860-7702-4c17-b763-8a2da103ca6d" (UID: "25d2c860-7702-4c17-b763-8a2da103ca6d"). InnerVolumeSpecName "kube-api-access-25jmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.097812 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25jmh\" (UniqueName: \"kubernetes.io/projected/25d2c860-7702-4c17-b763-8a2da103ca6d-kube-api-access-25jmh\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.554694 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" event={"ID":"25d2c860-7702-4c17-b763-8a2da103ca6d","Type":"ContainerDied","Data":"7b26007c64fc32cd45a85b8aa6604cdff9e398078ec0dfdd4e424eab2144366c"} Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.554762 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b26007c64fc32cd45a85b8aa6604cdff9e398078ec0dfdd4e424eab2144366c" Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.554846 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-4vwb5" Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.618390 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-4xvkv"] Mar 17 01:34:06 crc kubenswrapper[4735]: I0317 01:34:06.629461 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-4xvkv"] Mar 17 01:34:07 crc kubenswrapper[4735]: I0317 01:34:07.094757 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0e19e4-fcba-4981-ad27-32c13a744be5" path="/var/lib/kubelet/pods/9b0e19e4-fcba-4981-ad27-32c13a744be5/volumes" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.268048 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.422429 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f77cfbd7c-pjq9q"] Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.422664 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" podUID="61734c18-4914-46f6-8994-0801068b497b" containerName="dnsmasq-dns" containerID="cri-o://792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7" gracePeriod=10 Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.690445 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745d4b7977-nmmzk"] Mar 17 01:34:09 crc kubenswrapper[4735]: E0317 01:34:09.690792 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d2c860-7702-4c17-b763-8a2da103ca6d" containerName="oc" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.690808 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d2c860-7702-4c17-b763-8a2da103ca6d" containerName="oc" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.691014 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="25d2c860-7702-4c17-b763-8a2da103ca6d" containerName="oc" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.692620 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.723421 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745d4b7977-nmmzk"] Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805346 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-dns-svc\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805623 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-openstack-edpm-ipam\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805683 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-dns-swift-storage-0\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2xs\" (UniqueName: \"kubernetes.io/projected/274120fc-6c2e-4443-838e-3657b2f4eeef-kube-api-access-nw2xs\") pod 
\"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-ovsdbserver-sb\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805915 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-config\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.805950 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-ovsdbserver-nb\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.907958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-config\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-ovsdbserver-nb\") pod 
\"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-dns-svc\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908369 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-openstack-edpm-ipam\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-dns-swift-storage-0\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908462 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2xs\" (UniqueName: \"kubernetes.io/projected/274120fc-6c2e-4443-838e-3657b2f4eeef-kube-api-access-nw2xs\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908479 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-ovsdbserver-sb\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" 
(UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.908774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-config\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.909214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-ovsdbserver-sb\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.909368 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-openstack-edpm-ipam\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.909713 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-ovsdbserver-nb\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.909780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-dns-swift-storage-0\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " 
pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.909905 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274120fc-6c2e-4443-838e-3657b2f4eeef-dns-svc\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:09 crc kubenswrapper[4735]: I0317 01:34:09.937399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2xs\" (UniqueName: \"kubernetes.io/projected/274120fc-6c2e-4443-838e-3657b2f4eeef-kube-api-access-nw2xs\") pod \"dnsmasq-dns-745d4b7977-nmmzk\" (UID: \"274120fc-6c2e-4443-838e-3657b2f4eeef\") " pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.008373 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.090119 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.214457 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rwk7\" (UniqueName: \"kubernetes.io/projected/61734c18-4914-46f6-8994-0801068b497b-kube-api-access-4rwk7\") pod \"61734c18-4914-46f6-8994-0801068b497b\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.214518 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-svc\") pod \"61734c18-4914-46f6-8994-0801068b497b\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.214556 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-swift-storage-0\") pod \"61734c18-4914-46f6-8994-0801068b497b\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.214635 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-nb\") pod \"61734c18-4914-46f6-8994-0801068b497b\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.214665 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-config\") pod \"61734c18-4914-46f6-8994-0801068b497b\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.214811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-sb\") pod \"61734c18-4914-46f6-8994-0801068b497b\" (UID: \"61734c18-4914-46f6-8994-0801068b497b\") " Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.234808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61734c18-4914-46f6-8994-0801068b497b-kube-api-access-4rwk7" (OuterVolumeSpecName: "kube-api-access-4rwk7") pod "61734c18-4914-46f6-8994-0801068b497b" (UID: "61734c18-4914-46f6-8994-0801068b497b"). InnerVolumeSpecName "kube-api-access-4rwk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.301410 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61734c18-4914-46f6-8994-0801068b497b" (UID: "61734c18-4914-46f6-8994-0801068b497b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.311803 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61734c18-4914-46f6-8994-0801068b497b" (UID: "61734c18-4914-46f6-8994-0801068b497b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.312045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61734c18-4914-46f6-8994-0801068b497b" (UID: "61734c18-4914-46f6-8994-0801068b497b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.315537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-config" (OuterVolumeSpecName: "config") pod "61734c18-4914-46f6-8994-0801068b497b" (UID: "61734c18-4914-46f6-8994-0801068b497b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.317377 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.320134 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.320289 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rwk7\" (UniqueName: \"kubernetes.io/projected/61734c18-4914-46f6-8994-0801068b497b-kube-api-access-4rwk7\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.320353 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.327970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "61734c18-4914-46f6-8994-0801068b497b" (UID: "61734c18-4914-46f6-8994-0801068b497b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.422472 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.422513 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61734c18-4914-46f6-8994-0801068b497b-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.505757 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745d4b7977-nmmzk"] Mar 17 01:34:10 crc kubenswrapper[4735]: W0317 01:34:10.507435 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274120fc_6c2e_4443_838e_3657b2f4eeef.slice/crio-6651e515852f64d9ce5c4ac442d1dbca49411c282aff531de69dba8135fb24c1 WatchSource:0}: Error finding container 6651e515852f64d9ce5c4ac442d1dbca49411c282aff531de69dba8135fb24c1: Status 404 returned error can't find the container with id 6651e515852f64d9ce5c4ac442d1dbca49411c282aff531de69dba8135fb24c1 Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.610398 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" event={"ID":"274120fc-6c2e-4443-838e-3657b2f4eeef","Type":"ContainerStarted","Data":"6651e515852f64d9ce5c4ac442d1dbca49411c282aff531de69dba8135fb24c1"} Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.611795 4735 generic.go:334] "Generic (PLEG): container finished" podID="61734c18-4914-46f6-8994-0801068b497b" containerID="792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7" exitCode=0 Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.611831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" event={"ID":"61734c18-4914-46f6-8994-0801068b497b","Type":"ContainerDied","Data":"792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7"} Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.611847 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.611874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77cfbd7c-pjq9q" event={"ID":"61734c18-4914-46f6-8994-0801068b497b","Type":"ContainerDied","Data":"fd7f0c7f56a3b13dd837274af05c9ae9029b01c82ef5ca125d564b3d45d9a9c4"} Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.611894 4735 scope.go:117] "RemoveContainer" containerID="792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.644130 4735 scope.go:117] "RemoveContainer" containerID="4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.715917 4735 scope.go:117] "RemoveContainer" containerID="792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7" Mar 17 01:34:10 crc kubenswrapper[4735]: E0317 01:34:10.716646 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7\": container with ID starting with 792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7 not found: ID does not exist" containerID="792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.716684 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7"} err="failed to get container status 
\"792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7\": rpc error: code = NotFound desc = could not find container \"792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7\": container with ID starting with 792dc317d74385495c8072b50f24b9eb98bdb5ee9e9cc306bfffc10579fda0a7 not found: ID does not exist" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.716709 4735 scope.go:117] "RemoveContainer" containerID="4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac" Mar 17 01:34:10 crc kubenswrapper[4735]: E0317 01:34:10.717140 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac\": container with ID starting with 4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac not found: ID does not exist" containerID="4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.717192 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac"} err="failed to get container status \"4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac\": rpc error: code = NotFound desc = could not find container \"4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac\": container with ID starting with 4a4efcd7bcb3132090821b9588d342cf07802a3f00b7fa129de39b5597d1fdac not found: ID does not exist" Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.723686 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f77cfbd7c-pjq9q"] Mar 17 01:34:10 crc kubenswrapper[4735]: I0317 01:34:10.738233 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f77cfbd7c-pjq9q"] Mar 17 01:34:11 crc kubenswrapper[4735]: I0317 01:34:11.083097 4735 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="61734c18-4914-46f6-8994-0801068b497b" path="/var/lib/kubelet/pods/61734c18-4914-46f6-8994-0801068b497b/volumes" Mar 17 01:34:11 crc kubenswrapper[4735]: I0317 01:34:11.622535 4735 generic.go:334] "Generic (PLEG): container finished" podID="274120fc-6c2e-4443-838e-3657b2f4eeef" containerID="31e5cb8a539ce0669a4011f10375692d1b09af5871d6207bc942dfbf32f1b20a" exitCode=0 Mar 17 01:34:11 crc kubenswrapper[4735]: I0317 01:34:11.623670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" event={"ID":"274120fc-6c2e-4443-838e-3657b2f4eeef","Type":"ContainerDied","Data":"31e5cb8a539ce0669a4011f10375692d1b09af5871d6207bc942dfbf32f1b20a"} Mar 17 01:34:12 crc kubenswrapper[4735]: I0317 01:34:12.664463 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" event={"ID":"274120fc-6c2e-4443-838e-3657b2f4eeef","Type":"ContainerStarted","Data":"b6b42f5d03b0d12356a9b3831deaf9825aaef192edf888895beff75b13e3e6b8"} Mar 17 01:34:12 crc kubenswrapper[4735]: I0317 01:34:12.666340 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:12 crc kubenswrapper[4735]: I0317 01:34:12.700128 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" podStartSLOduration=3.7001018439999998 podStartE2EDuration="3.700101844s" podCreationTimestamp="2026-03-17 01:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:34:12.686311842 +0000 UTC m=+1478.318544840" watchObservedRunningTime="2026-03-17 01:34:12.700101844 +0000 UTC m=+1478.332334822" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.011195 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-745d4b7977-nmmzk" Mar 17 01:34:20 crc 
kubenswrapper[4735]: I0317 01:34:20.088050 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8654f7787c-gjsb2"] Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.088480 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerName="dnsmasq-dns" containerID="cri-o://61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5" gracePeriod=10 Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.698454 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.758491 4735 generic.go:334] "Generic (PLEG): container finished" podID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerID="61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5" exitCode=0 Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.758535 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" event={"ID":"49017d12-20c8-4df4-88e8-8da4ed4d106e","Type":"ContainerDied","Data":"61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5"} Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.758560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" event={"ID":"49017d12-20c8-4df4-88e8-8da4ed4d106e","Type":"ContainerDied","Data":"bfc215087937bb0370f867e4dfc5cf44bcbce960d9fcf1c08285a54b32ac461b"} Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.758577 4735 scope.go:117] "RemoveContainer" containerID="61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.758705 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8654f7787c-gjsb2" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.779795 4735 scope.go:117] "RemoveContainer" containerID="03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.815734 4735 scope.go:117] "RemoveContainer" containerID="61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5" Mar 17 01:34:20 crc kubenswrapper[4735]: E0317 01:34:20.816222 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5\": container with ID starting with 61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5 not found: ID does not exist" containerID="61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.816270 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5"} err="failed to get container status \"61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5\": rpc error: code = NotFound desc = could not find container \"61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5\": container with ID starting with 61653a335b3f44f804eb088c22820f22d2432c1f280936b5a0b14e2e76da1de5 not found: ID does not exist" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.816295 4735 scope.go:117] "RemoveContainer" containerID="03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5" Mar 17 01:34:20 crc kubenswrapper[4735]: E0317 01:34:20.816675 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5\": container with ID starting with 
03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5 not found: ID does not exist" containerID="03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.816715 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5"} err="failed to get container status \"03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5\": rpc error: code = NotFound desc = could not find container \"03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5\": container with ID starting with 03b0afa825bd07e051c955c96847c67c5ca1cdc6704c0cb42fc74d70b43804c5 not found: ID does not exist" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846352 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-config\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846467 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d44dd\" (UniqueName: \"kubernetes.io/projected/49017d12-20c8-4df4-88e8-8da4ed4d106e-kube-api-access-d44dd\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846542 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-nb\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846576 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-swift-storage-0\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846606 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-svc\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846641 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-sb\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.846722 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-openstack-edpm-ipam\") pod \"49017d12-20c8-4df4-88e8-8da4ed4d106e\" (UID: \"49017d12-20c8-4df4-88e8-8da4ed4d106e\") " Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.876362 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49017d12-20c8-4df4-88e8-8da4ed4d106e-kube-api-access-d44dd" (OuterVolumeSpecName: "kube-api-access-d44dd") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "kube-api-access-d44dd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.899934 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.906021 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.908504 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-config" (OuterVolumeSpecName: "config") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.921977 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.925073 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.927610 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49017d12-20c8-4df4-88e8-8da4ed4d106e" (UID: "49017d12-20c8-4df4-88e8-8da4ed4d106e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949190 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d44dd\" (UniqueName: \"kubernetes.io/projected/49017d12-20c8-4df4-88e8-8da4ed4d106e-kube-api-access-d44dd\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949231 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949246 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949260 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-dns-svc\") on node \"crc\" DevicePath 
\"\"" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949271 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949282 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:20 crc kubenswrapper[4735]: I0317 01:34:20.949297 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49017d12-20c8-4df4-88e8-8da4ed4d106e-config\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:21 crc kubenswrapper[4735]: I0317 01:34:21.088166 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8654f7787c-gjsb2"] Mar 17 01:34:21 crc kubenswrapper[4735]: I0317 01:34:21.102298 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8654f7787c-gjsb2"] Mar 17 01:34:23 crc kubenswrapper[4735]: I0317 01:34:23.086565 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" path="/var/lib/kubelet/pods/49017d12-20c8-4df4-88e8-8da4ed4d106e/volumes" Mar 17 01:34:30 crc kubenswrapper[4735]: I0317 01:34:30.899736 4735 generic.go:334] "Generic (PLEG): container finished" podID="5023c673-f338-49b0-b6ef-9bf53abfdb28" containerID="9f8fd15de164bda4618b67778170c66182a3dacdbc9ce1b47f3bdc5feb2819e2" exitCode=0 Mar 17 01:34:30 crc kubenswrapper[4735]: I0317 01:34:30.899792 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5023c673-f338-49b0-b6ef-9bf53abfdb28","Type":"ContainerDied","Data":"9f8fd15de164bda4618b67778170c66182a3dacdbc9ce1b47f3bdc5feb2819e2"} Mar 17 01:34:30 crc kubenswrapper[4735]: I0317 
01:34:30.902733 4735 generic.go:334] "Generic (PLEG): container finished" podID="477a21f3-fdbe-42ea-bcd9-05fc4dca6a52" containerID="662e765a29917a365e4b9c3598a7485a4095605acdfa1b3c7c036cb28c3ff21e" exitCode=0 Mar 17 01:34:30 crc kubenswrapper[4735]: I0317 01:34:30.902771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52","Type":"ContainerDied","Data":"662e765a29917a365e4b9c3598a7485a4095605acdfa1b3c7c036cb28c3ff21e"} Mar 17 01:34:31 crc kubenswrapper[4735]: I0317 01:34:31.926725 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5023c673-f338-49b0-b6ef-9bf53abfdb28","Type":"ContainerStarted","Data":"ea14c9c4167aeee3f2d26d70116165c8cf4c2586e85d1853c53ec1482b48009f"} Mar 17 01:34:31 crc kubenswrapper[4735]: I0317 01:34:31.927725 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:34:31 crc kubenswrapper[4735]: I0317 01:34:31.933323 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"477a21f3-fdbe-42ea-bcd9-05fc4dca6a52","Type":"ContainerStarted","Data":"126f3c2fa250dcc5c108d39a70b526291fb7a2acde726c014fa0a6c87c6b1b79"} Mar 17 01:34:31 crc kubenswrapper[4735]: I0317 01:34:31.933950 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 17 01:34:31 crc kubenswrapper[4735]: I0317 01:34:31.958022 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.958004351 podStartE2EDuration="35.958004351s" podCreationTimestamp="2026-03-17 01:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:34:31.954339507 +0000 UTC m=+1497.586572485" watchObservedRunningTime="2026-03-17 01:34:31.958004351 
+0000 UTC m=+1497.590237329" Mar 17 01:34:31 crc kubenswrapper[4735]: I0317 01:34:31.989848 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.989831573000004 podStartE2EDuration="36.989831573s" podCreationTimestamp="2026-03-17 01:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:34:31.987516299 +0000 UTC m=+1497.619749277" watchObservedRunningTime="2026-03-17 01:34:31.989831573 +0000 UTC m=+1497.622064551" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.237438 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn"] Mar 17 01:34:38 crc kubenswrapper[4735]: E0317 01:34:38.238255 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerName="dnsmasq-dns" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.238267 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerName="dnsmasq-dns" Mar 17 01:34:38 crc kubenswrapper[4735]: E0317 01:34:38.238282 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerName="init" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.238307 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerName="init" Mar 17 01:34:38 crc kubenswrapper[4735]: E0317 01:34:38.238323 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61734c18-4914-46f6-8994-0801068b497b" containerName="dnsmasq-dns" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.238328 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="61734c18-4914-46f6-8994-0801068b497b" containerName="dnsmasq-dns" Mar 17 01:34:38 crc kubenswrapper[4735]: E0317 01:34:38.238348 
4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61734c18-4914-46f6-8994-0801068b497b" containerName="init" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.238353 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="61734c18-4914-46f6-8994-0801068b497b" containerName="init" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.238522 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="49017d12-20c8-4df4-88e8-8da4ed4d106e" containerName="dnsmasq-dns" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.238535 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="61734c18-4914-46f6-8994-0801068b497b" containerName="dnsmasq-dns" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.239206 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.242670 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.243081 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.243203 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.243209 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.258194 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn"] Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.441838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.441971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9x4w\" (UniqueName: \"kubernetes.io/projected/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-kube-api-access-s9x4w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.442043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.442082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.543317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9x4w\" (UniqueName: \"kubernetes.io/projected/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-kube-api-access-s9x4w\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.543406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.543447 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.543496 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.548833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc 
kubenswrapper[4735]: I0317 01:34:38.549940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.550722 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.582148 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9x4w\" (UniqueName: \"kubernetes.io/projected/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-kube-api-access-s9x4w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:38 crc kubenswrapper[4735]: I0317 01:34:38.857654 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:34:39 crc kubenswrapper[4735]: I0317 01:34:39.735379 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn"] Mar 17 01:34:40 crc kubenswrapper[4735]: I0317 01:34:40.006691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" event={"ID":"9cc576b2-b1b2-426f-bec9-ebcd2ad12150","Type":"ContainerStarted","Data":"6a649e6d717b566c2ca39e3fc70f26db45065e166414dc88f0972e24db3007c8"} Mar 17 01:34:46 crc kubenswrapper[4735]: I0317 01:34:46.298030 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 17 01:34:47 crc kubenswrapper[4735]: I0317 01:34:47.021785 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 17 01:34:48 crc kubenswrapper[4735]: I0317 01:34:48.223170 4735 scope.go:117] "RemoveContainer" containerID="97bb2d235347544a900063a9c3ad48b344862bc6348b84beba23bd1304ca3278" Mar 17 01:34:51 crc kubenswrapper[4735]: I0317 01:34:51.171474 4735 scope.go:117] "RemoveContainer" containerID="7e5dafcede5736a2446e1649ad54298b053c86bcbf9d9fddcdec87aaf10e00e2" Mar 17 01:34:51 crc kubenswrapper[4735]: I0317 01:34:51.243496 4735 scope.go:117] "RemoveContainer" containerID="25ff3155c58eacfa3b7a0810dcefee81ac46ae8c2e4337c4492a17e869a9c7a2" Mar 17 01:34:52 crc kubenswrapper[4735]: I0317 01:34:52.129379 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" event={"ID":"9cc576b2-b1b2-426f-bec9-ebcd2ad12150","Type":"ContainerStarted","Data":"5af277217559a5fb7472bd5fe3a9bd06bf77f892ab7610670c7dd334ebe4a154"} Mar 17 01:34:52 crc kubenswrapper[4735]: I0317 01:34:52.159027 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" podStartSLOduration=2.65786757 podStartE2EDuration="14.159004047s" podCreationTimestamp="2026-03-17 01:34:38 +0000 UTC" firstStartedPulling="2026-03-17 01:34:39.744650189 +0000 UTC m=+1505.376883167" lastFinishedPulling="2026-03-17 01:34:51.245786656 +0000 UTC m=+1516.878019644" observedRunningTime="2026-03-17 01:34:52.156065439 +0000 UTC m=+1517.788298457" watchObservedRunningTime="2026-03-17 01:34:52.159004047 +0000 UTC m=+1517.791237045" Mar 17 01:35:03 crc kubenswrapper[4735]: I0317 01:35:03.245931 4735 generic.go:334] "Generic (PLEG): container finished" podID="9cc576b2-b1b2-426f-bec9-ebcd2ad12150" containerID="5af277217559a5fb7472bd5fe3a9bd06bf77f892ab7610670c7dd334ebe4a154" exitCode=0 Mar 17 01:35:03 crc kubenswrapper[4735]: I0317 01:35:03.246161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" event={"ID":"9cc576b2-b1b2-426f-bec9-ebcd2ad12150","Type":"ContainerDied","Data":"5af277217559a5fb7472bd5fe3a9bd06bf77f892ab7610670c7dd334ebe4a154"} Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.733145 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.805963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-repo-setup-combined-ca-bundle\") pod \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.806036 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-inventory\") pod \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.806217 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-ssh-key-openstack-edpm-ipam\") pod \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.806470 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9x4w\" (UniqueName: \"kubernetes.io/projected/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-kube-api-access-s9x4w\") pod \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\" (UID: \"9cc576b2-b1b2-426f-bec9-ebcd2ad12150\") " Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.815239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9cc576b2-b1b2-426f-bec9-ebcd2ad12150" (UID: "9cc576b2-b1b2-426f-bec9-ebcd2ad12150"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.821152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-kube-api-access-s9x4w" (OuterVolumeSpecName: "kube-api-access-s9x4w") pod "9cc576b2-b1b2-426f-bec9-ebcd2ad12150" (UID: "9cc576b2-b1b2-426f-bec9-ebcd2ad12150"). InnerVolumeSpecName "kube-api-access-s9x4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.842982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9cc576b2-b1b2-426f-bec9-ebcd2ad12150" (UID: "9cc576b2-b1b2-426f-bec9-ebcd2ad12150"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.861444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-inventory" (OuterVolumeSpecName: "inventory") pod "9cc576b2-b1b2-426f-bec9-ebcd2ad12150" (UID: "9cc576b2-b1b2-426f-bec9-ebcd2ad12150"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.909029 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9x4w\" (UniqueName: \"kubernetes.io/projected/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-kube-api-access-s9x4w\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.909256 4735 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.909344 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:04 crc kubenswrapper[4735]: I0317 01:35:04.909441 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc576b2-b1b2-426f-bec9-ebcd2ad12150-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.268163 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" event={"ID":"9cc576b2-b1b2-426f-bec9-ebcd2ad12150","Type":"ContainerDied","Data":"6a649e6d717b566c2ca39e3fc70f26db45065e166414dc88f0972e24db3007c8"} Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.268238 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.268206 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a649e6d717b566c2ca39e3fc70f26db45065e166414dc88f0972e24db3007c8" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.361111 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc"] Mar 17 01:35:05 crc kubenswrapper[4735]: E0317 01:35:05.361551 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc576b2-b1b2-426f-bec9-ebcd2ad12150" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.361571 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc576b2-b1b2-426f-bec9-ebcd2ad12150" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.361831 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc576b2-b1b2-426f-bec9-ebcd2ad12150" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.362554 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.366567 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.366820 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.367010 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.367643 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.391155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc"] Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.549225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fn99\" (UniqueName: \"kubernetes.io/projected/afc10dcf-31e4-477c-8f5c-310c6da60988-kube-api-access-9fn99\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.549841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.549919 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.651840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fn99\" (UniqueName: \"kubernetes.io/projected/afc10dcf-31e4-477c-8f5c-310c6da60988-kube-api-access-9fn99\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.651912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.651949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.656777 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.664714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.668277 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fn99\" (UniqueName: \"kubernetes.io/projected/afc10dcf-31e4-477c-8f5c-310c6da60988-kube-api-access-9fn99\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t9fc\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:05 crc kubenswrapper[4735]: I0317 01:35:05.687043 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:06 crc kubenswrapper[4735]: I0317 01:35:06.216269 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc"] Mar 17 01:35:06 crc kubenswrapper[4735]: I0317 01:35:06.277593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" event={"ID":"afc10dcf-31e4-477c-8f5c-310c6da60988","Type":"ContainerStarted","Data":"036652a30e85f5d4054fa8bd5bebe2e7071d141a676352a0e3c96c96e3bbad5b"} Mar 17 01:35:07 crc kubenswrapper[4735]: I0317 01:35:07.308176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" event={"ID":"afc10dcf-31e4-477c-8f5c-310c6da60988","Type":"ContainerStarted","Data":"e060a998d9420fc94fb46236793a5126594ff7c00c2cb5d014bd0548b66a86c8"} Mar 17 01:35:07 crc kubenswrapper[4735]: I0317 01:35:07.371771 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" podStartSLOduration=1.9545338970000001 podStartE2EDuration="2.371748259s" podCreationTimestamp="2026-03-17 01:35:05 +0000 UTC" firstStartedPulling="2026-03-17 01:35:06.226389689 +0000 UTC m=+1531.858622677" lastFinishedPulling="2026-03-17 01:35:06.643604021 +0000 UTC m=+1532.275837039" observedRunningTime="2026-03-17 01:35:07.356460253 +0000 UTC m=+1532.988693231" watchObservedRunningTime="2026-03-17 01:35:07.371748259 +0000 UTC m=+1533.003981247" Mar 17 01:35:10 crc kubenswrapper[4735]: I0317 01:35:10.339088 4735 generic.go:334] "Generic (PLEG): container finished" podID="afc10dcf-31e4-477c-8f5c-310c6da60988" containerID="e060a998d9420fc94fb46236793a5126594ff7c00c2cb5d014bd0548b66a86c8" exitCode=0 Mar 17 01:35:10 crc kubenswrapper[4735]: I0317 01:35:10.339320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" event={"ID":"afc10dcf-31e4-477c-8f5c-310c6da60988","Type":"ContainerDied","Data":"e060a998d9420fc94fb46236793a5126594ff7c00c2cb5d014bd0548b66a86c8"} Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.606223 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.606906 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.691110 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.793369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-inventory\") pod \"afc10dcf-31e4-477c-8f5c-310c6da60988\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.793718 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-ssh-key-openstack-edpm-ipam\") pod \"afc10dcf-31e4-477c-8f5c-310c6da60988\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.793865 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fn99\" (UniqueName: \"kubernetes.io/projected/afc10dcf-31e4-477c-8f5c-310c6da60988-kube-api-access-9fn99\") pod \"afc10dcf-31e4-477c-8f5c-310c6da60988\" (UID: \"afc10dcf-31e4-477c-8f5c-310c6da60988\") " Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.804134 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc10dcf-31e4-477c-8f5c-310c6da60988-kube-api-access-9fn99" (OuterVolumeSpecName: "kube-api-access-9fn99") pod "afc10dcf-31e4-477c-8f5c-310c6da60988" (UID: "afc10dcf-31e4-477c-8f5c-310c6da60988"). InnerVolumeSpecName "kube-api-access-9fn99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.828290 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-inventory" (OuterVolumeSpecName: "inventory") pod "afc10dcf-31e4-477c-8f5c-310c6da60988" (UID: "afc10dcf-31e4-477c-8f5c-310c6da60988"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.838091 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "afc10dcf-31e4-477c-8f5c-310c6da60988" (UID: "afc10dcf-31e4-477c-8f5c-310c6da60988"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.896791 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.896846 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afc10dcf-31e4-477c-8f5c-310c6da60988-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:12 crc kubenswrapper[4735]: I0317 01:35:12.896907 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fn99\" (UniqueName: \"kubernetes.io/projected/afc10dcf-31e4-477c-8f5c-310c6da60988-kube-api-access-9fn99\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:13 crc kubenswrapper[4735]: E0317 01:35:13.256119 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafc10dcf_31e4_477c_8f5c_310c6da60988.slice/crio-036652a30e85f5d4054fa8bd5bebe2e7071d141a676352a0e3c96c96e3bbad5b\": RecentStats: unable to find data in memory cache]" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.406077 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.406151 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t9fc" event={"ID":"afc10dcf-31e4-477c-8f5c-310c6da60988","Type":"ContainerDied","Data":"036652a30e85f5d4054fa8bd5bebe2e7071d141a676352a0e3c96c96e3bbad5b"} Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.406487 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="036652a30e85f5d4054fa8bd5bebe2e7071d141a676352a0e3c96c96e3bbad5b" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.872905 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9"] Mar 17 01:35:13 crc kubenswrapper[4735]: E0317 01:35:13.873506 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc10dcf-31e4-477c-8f5c-310c6da60988" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.873525 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc10dcf-31e4-477c-8f5c-310c6da60988" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.873876 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc10dcf-31e4-477c-8f5c-310c6da60988" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.874813 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.882201 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.882253 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.882415 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.882561 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:35:13 crc kubenswrapper[4735]: I0317 01:35:13.910874 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9"] Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.031740 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.031936 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: 
I0317 01:35:14.032004 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89s6m\" (UniqueName: \"kubernetes.io/projected/33665f4a-8504-4b35-9850-d2a567b93418-kube-api-access-89s6m\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.032126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.133887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.134037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.134092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89s6m\" (UniqueName: 
\"kubernetes.io/projected/33665f4a-8504-4b35-9850-d2a567b93418-kube-api-access-89s6m\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.134163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.140340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.145256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.146322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.154985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89s6m\" (UniqueName: \"kubernetes.io/projected/33665f4a-8504-4b35-9850-d2a567b93418-kube-api-access-89s6m\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.194521 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:35:14 crc kubenswrapper[4735]: I0317 01:35:14.810807 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9"] Mar 17 01:35:15 crc kubenswrapper[4735]: I0317 01:35:15.429445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" event={"ID":"33665f4a-8504-4b35-9850-d2a567b93418","Type":"ContainerStarted","Data":"2b4c31010656f2d522f50f7b6a0e54a75b037343d567a04fd7178b6f84d5208e"} Mar 17 01:35:16 crc kubenswrapper[4735]: I0317 01:35:16.438936 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" event={"ID":"33665f4a-8504-4b35-9850-d2a567b93418","Type":"ContainerStarted","Data":"e01f1632860b0689f38daf4256d525c6ed25796c8e8049ec5a59b66579491e33"} Mar 17 01:35:16 crc kubenswrapper[4735]: I0317 01:35:16.467452 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" podStartSLOduration=2.948915395 podStartE2EDuration="3.467425463s" podCreationTimestamp="2026-03-17 01:35:13 +0000 UTC" firstStartedPulling="2026-03-17 01:35:14.812013475 +0000 UTC m=+1540.444246443" 
lastFinishedPulling="2026-03-17 01:35:15.330523523 +0000 UTC m=+1540.962756511" observedRunningTime="2026-03-17 01:35:16.460079874 +0000 UTC m=+1542.092312862" watchObservedRunningTime="2026-03-17 01:35:16.467425463 +0000 UTC m=+1542.099658451" Mar 17 01:35:42 crc kubenswrapper[4735]: I0317 01:35:42.606254 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:35:42 crc kubenswrapper[4735]: I0317 01:35:42.608908 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.594329 4735 scope.go:117] "RemoveContainer" containerID="3bacd3aafc06b6db3108ef9de8c2a3248c6ade2a6af01e1f1f7da54d51e19845" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.633215 4735 scope.go:117] "RemoveContainer" containerID="6e666d4e4c78c41a6d71e9a0fdf011dba9f4b17db66cfde061aff5aaf6438c05" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.687580 4735 scope.go:117] "RemoveContainer" containerID="ac7adbd2d4d069cd0191b2861ad467436e6dd5b80fd688426f3f0caaf4d4048e" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.748227 4735 scope.go:117] "RemoveContainer" containerID="6c22d0a33b7a7feb5c716d97ef6a490ca1f3f90d8d3e30b968cf89ada6c7f3db" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.794916 4735 scope.go:117] "RemoveContainer" containerID="f91aa2371c330a548923615f99c1fd2995b9c30c9ba7438cd3426ffed2aadb00" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.831191 4735 scope.go:117] "RemoveContainer" 
containerID="0add670582044a8cc9dbed9bb3d32df42740b89106b4b002c039ddf98a005495" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.871949 4735 scope.go:117] "RemoveContainer" containerID="28057c7a8bb9f4b30d111669f97c3397827c24b0312f1d99a5cd90d8a0a1ce5f" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.912628 4735 scope.go:117] "RemoveContainer" containerID="a680d6fbf902d3c23d3b87a260006e1e97ade12bbe5307e7ae17b7161fad046a" Mar 17 01:35:51 crc kubenswrapper[4735]: I0317 01:35:51.934993 4735 scope.go:117] "RemoveContainer" containerID="8438c54bbe429594476cc802adbf16afabcc27cd0ebfba431d5a1cb790fc7682" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.147525 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561856-jrmwm"] Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.149467 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.151458 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.151472 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.151521 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.163241 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-jrmwm"] Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.293375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrgwr\" (UniqueName: \"kubernetes.io/projected/e9596f2d-2d29-49d7-aed8-d0554e7ffcec-kube-api-access-zrgwr\") pod 
\"auto-csr-approver-29561856-jrmwm\" (UID: \"e9596f2d-2d29-49d7-aed8-d0554e7ffcec\") " pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.395805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgwr\" (UniqueName: \"kubernetes.io/projected/e9596f2d-2d29-49d7-aed8-d0554e7ffcec-kube-api-access-zrgwr\") pod \"auto-csr-approver-29561856-jrmwm\" (UID: \"e9596f2d-2d29-49d7-aed8-d0554e7ffcec\") " pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.412627 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgwr\" (UniqueName: \"kubernetes.io/projected/e9596f2d-2d29-49d7-aed8-d0554e7ffcec-kube-api-access-zrgwr\") pod \"auto-csr-approver-29561856-jrmwm\" (UID: \"e9596f2d-2d29-49d7-aed8-d0554e7ffcec\") " pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.465315 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.979756 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-jrmwm"] Mar 17 01:36:00 crc kubenswrapper[4735]: I0317 01:36:00.986962 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:36:02 crc kubenswrapper[4735]: I0317 01:36:02.012672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" event={"ID":"e9596f2d-2d29-49d7-aed8-d0554e7ffcec","Type":"ContainerStarted","Data":"5103f5a199dc50cea6f29e07c75f9804fb5595e6dc4089ac870986066aaa9251"} Mar 17 01:36:03 crc kubenswrapper[4735]: I0317 01:36:03.033056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" event={"ID":"e9596f2d-2d29-49d7-aed8-d0554e7ffcec","Type":"ContainerStarted","Data":"35beb1b65b3034503734723db98ea8fe9928dfa05c4c44756f9f196abcf7969b"} Mar 17 01:36:04 crc kubenswrapper[4735]: I0317 01:36:04.046788 4735 generic.go:334] "Generic (PLEG): container finished" podID="e9596f2d-2d29-49d7-aed8-d0554e7ffcec" containerID="35beb1b65b3034503734723db98ea8fe9928dfa05c4c44756f9f196abcf7969b" exitCode=0 Mar 17 01:36:04 crc kubenswrapper[4735]: I0317 01:36:04.046943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" event={"ID":"e9596f2d-2d29-49d7-aed8-d0554e7ffcec","Type":"ContainerDied","Data":"35beb1b65b3034503734723db98ea8fe9928dfa05c4c44756f9f196abcf7969b"} Mar 17 01:36:05 crc kubenswrapper[4735]: I0317 01:36:05.475144 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:05 crc kubenswrapper[4735]: I0317 01:36:05.618834 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrgwr\" (UniqueName: \"kubernetes.io/projected/e9596f2d-2d29-49d7-aed8-d0554e7ffcec-kube-api-access-zrgwr\") pod \"e9596f2d-2d29-49d7-aed8-d0554e7ffcec\" (UID: \"e9596f2d-2d29-49d7-aed8-d0554e7ffcec\") " Mar 17 01:36:05 crc kubenswrapper[4735]: I0317 01:36:05.634894 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9596f2d-2d29-49d7-aed8-d0554e7ffcec-kube-api-access-zrgwr" (OuterVolumeSpecName: "kube-api-access-zrgwr") pod "e9596f2d-2d29-49d7-aed8-d0554e7ffcec" (UID: "e9596f2d-2d29-49d7-aed8-d0554e7ffcec"). InnerVolumeSpecName "kube-api-access-zrgwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:05 crc kubenswrapper[4735]: I0317 01:36:05.724263 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrgwr\" (UniqueName: \"kubernetes.io/projected/e9596f2d-2d29-49d7-aed8-d0554e7ffcec-kube-api-access-zrgwr\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:06 crc kubenswrapper[4735]: I0317 01:36:06.072813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" event={"ID":"e9596f2d-2d29-49d7-aed8-d0554e7ffcec","Type":"ContainerDied","Data":"5103f5a199dc50cea6f29e07c75f9804fb5595e6dc4089ac870986066aaa9251"} Mar 17 01:36:06 crc kubenswrapper[4735]: I0317 01:36:06.072917 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5103f5a199dc50cea6f29e07c75f9804fb5595e6dc4089ac870986066aaa9251" Mar 17 01:36:06 crc kubenswrapper[4735]: I0317 01:36:06.072872 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-jrmwm" Mar 17 01:36:06 crc kubenswrapper[4735]: I0317 01:36:06.148902 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-jpv8n"] Mar 17 01:36:06 crc kubenswrapper[4735]: I0317 01:36:06.161936 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-jpv8n"] Mar 17 01:36:07 crc kubenswrapper[4735]: I0317 01:36:07.097705 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4f5e79-ad2d-4c96-8dae-7bd8360d2111" path="/var/lib/kubelet/pods/0b4f5e79-ad2d-4c96-8dae-7bd8360d2111/volumes" Mar 17 01:36:12 crc kubenswrapper[4735]: I0317 01:36:12.606134 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:36:12 crc kubenswrapper[4735]: I0317 01:36:12.606978 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:36:12 crc kubenswrapper[4735]: I0317 01:36:12.607055 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:36:12 crc kubenswrapper[4735]: I0317 01:36:12.608058 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:36:12 crc kubenswrapper[4735]: I0317 01:36:12.608136 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" gracePeriod=600 Mar 17 01:36:12 crc kubenswrapper[4735]: E0317 01:36:12.738157 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:36:13 crc kubenswrapper[4735]: I0317 01:36:13.152301 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" exitCode=0 Mar 17 01:36:13 crc kubenswrapper[4735]: I0317 01:36:13.152391 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba"} Mar 17 01:36:13 crc kubenswrapper[4735]: I0317 01:36:13.153061 4735 scope.go:117] "RemoveContainer" containerID="e8261aabe9466b0d5bcb35dde5a9ae7325de3d924eb46b6cef2441ec6483254b" Mar 17 01:36:13 crc kubenswrapper[4735]: I0317 01:36:13.154017 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:36:13 crc kubenswrapper[4735]: E0317 01:36:13.154679 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:36:27 crc kubenswrapper[4735]: I0317 01:36:27.073628 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:36:27 crc kubenswrapper[4735]: E0317 01:36:27.074554 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:36:40 crc kubenswrapper[4735]: I0317 01:36:40.073323 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:36:40 crc kubenswrapper[4735]: E0317 01:36:40.074038 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.626970 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-psqg5"] Mar 17 01:36:47 crc kubenswrapper[4735]: E0317 
01:36:47.627953 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9596f2d-2d29-49d7-aed8-d0554e7ffcec" containerName="oc" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.627969 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9596f2d-2d29-49d7-aed8-d0554e7ffcec" containerName="oc" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.628227 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9596f2d-2d29-49d7-aed8-d0554e7ffcec" containerName="oc" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.635775 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.642042 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-psqg5"] Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.687544 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-utilities\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.687683 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptvk\" (UniqueName: \"kubernetes.io/projected/eca4ab4e-cbd9-4d26-af38-80645720a89e-kube-api-access-nptvk\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.687764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-catalog-content\") pod 
\"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.789678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-utilities\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.789796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nptvk\" (UniqueName: \"kubernetes.io/projected/eca4ab4e-cbd9-4d26-af38-80645720a89e-kube-api-access-nptvk\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.789828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-catalog-content\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.790407 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-catalog-content\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.790573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-utilities\") pod \"redhat-marketplace-psqg5\" (UID: 
\"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.812011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptvk\" (UniqueName: \"kubernetes.io/projected/eca4ab4e-cbd9-4d26-af38-80645720a89e-kube-api-access-nptvk\") pod \"redhat-marketplace-psqg5\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:47 crc kubenswrapper[4735]: I0317 01:36:47.967414 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:48 crc kubenswrapper[4735]: I0317 01:36:48.423320 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-psqg5"] Mar 17 01:36:48 crc kubenswrapper[4735]: I0317 01:36:48.527943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerStarted","Data":"623bb42e25c1640e87ede398f205080bad428f2556cbe2467845fe13bb711dc9"} Mar 17 01:36:49 crc kubenswrapper[4735]: I0317 01:36:49.553522 4735 generic.go:334] "Generic (PLEG): container finished" podID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerID="e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594" exitCode=0 Mar 17 01:36:49 crc kubenswrapper[4735]: I0317 01:36:49.553639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerDied","Data":"e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594"} Mar 17 01:36:50 crc kubenswrapper[4735]: I0317 01:36:50.594752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" 
event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerStarted","Data":"689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5"} Mar 17 01:36:51 crc kubenswrapper[4735]: I0317 01:36:51.076500 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:36:51 crc kubenswrapper[4735]: E0317 01:36:51.076981 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:36:52 crc kubenswrapper[4735]: I0317 01:36:52.054927 4735 scope.go:117] "RemoveContainer" containerID="9ba2851058879bceee1975ee5d1b42d679dbe16c6425dafb892c4528d7b4174e" Mar 17 01:36:52 crc kubenswrapper[4735]: I0317 01:36:52.112514 4735 scope.go:117] "RemoveContainer" containerID="c103170577fe4a9863aef3582a529d0a8802811946102c994ebca36ea21e535d" Mar 17 01:36:52 crc kubenswrapper[4735]: I0317 01:36:52.146068 4735 scope.go:117] "RemoveContainer" containerID="a4a2469a21ca4659b0a2d58bffb535ca7d9475ef2bce7abb4791fdd7aae28a5f" Mar 17 01:36:52 crc kubenswrapper[4735]: I0317 01:36:52.184560 4735 scope.go:117] "RemoveContainer" containerID="08513105419d0794b474f0e1b3d37209bb8aa9050cb4b6196b78890f7b7104e4" Mar 17 01:36:52 crc kubenswrapper[4735]: I0317 01:36:52.214010 4735 scope.go:117] "RemoveContainer" containerID="a8ad0e32b377d13a77f249cde37ff356a4b57b06deabf7dee6bd1ab518148e4a" Mar 17 01:36:52 crc kubenswrapper[4735]: I0317 01:36:52.622533 4735 generic.go:334] "Generic (PLEG): container finished" podID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerID="689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5" exitCode=0 Mar 17 01:36:52 crc 
kubenswrapper[4735]: I0317 01:36:52.622585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerDied","Data":"689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5"} Mar 17 01:36:53 crc kubenswrapper[4735]: I0317 01:36:53.638950 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerStarted","Data":"aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d"} Mar 17 01:36:53 crc kubenswrapper[4735]: I0317 01:36:53.663571 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-psqg5" podStartSLOduration=3.182830961 podStartE2EDuration="6.663551535s" podCreationTimestamp="2026-03-17 01:36:47 +0000 UTC" firstStartedPulling="2026-03-17 01:36:49.556338372 +0000 UTC m=+1635.188571390" lastFinishedPulling="2026-03-17 01:36:53.037058976 +0000 UTC m=+1638.669291964" observedRunningTime="2026-03-17 01:36:53.655360692 +0000 UTC m=+1639.287593670" watchObservedRunningTime="2026-03-17 01:36:53.663551535 +0000 UTC m=+1639.295784503" Mar 17 01:36:57 crc kubenswrapper[4735]: I0317 01:36:57.968484 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:57 crc kubenswrapper[4735]: I0317 01:36:57.970213 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:58 crc kubenswrapper[4735]: I0317 01:36:58.029357 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:58 crc kubenswrapper[4735]: I0317 01:36:58.763649 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:36:58 crc kubenswrapper[4735]: I0317 01:36:58.837624 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-psqg5"] Mar 17 01:37:00 crc kubenswrapper[4735]: I0317 01:37:00.739894 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-psqg5" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="registry-server" containerID="cri-o://aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d" gracePeriod=2 Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.346235 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.461939 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-catalog-content\") pod \"eca4ab4e-cbd9-4d26-af38-80645720a89e\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.462019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nptvk\" (UniqueName: \"kubernetes.io/projected/eca4ab4e-cbd9-4d26-af38-80645720a89e-kube-api-access-nptvk\") pod \"eca4ab4e-cbd9-4d26-af38-80645720a89e\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.462098 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-utilities\") pod \"eca4ab4e-cbd9-4d26-af38-80645720a89e\" (UID: \"eca4ab4e-cbd9-4d26-af38-80645720a89e\") " Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.463427 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-utilities" (OuterVolumeSpecName: "utilities") pod "eca4ab4e-cbd9-4d26-af38-80645720a89e" (UID: "eca4ab4e-cbd9-4d26-af38-80645720a89e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.469177 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca4ab4e-cbd9-4d26-af38-80645720a89e-kube-api-access-nptvk" (OuterVolumeSpecName: "kube-api-access-nptvk") pod "eca4ab4e-cbd9-4d26-af38-80645720a89e" (UID: "eca4ab4e-cbd9-4d26-af38-80645720a89e"). InnerVolumeSpecName "kube-api-access-nptvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.488574 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eca4ab4e-cbd9-4d26-af38-80645720a89e" (UID: "eca4ab4e-cbd9-4d26-af38-80645720a89e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.564355 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.564400 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nptvk\" (UniqueName: \"kubernetes.io/projected/eca4ab4e-cbd9-4d26-af38-80645720a89e-kube-api-access-nptvk\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.564414 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca4ab4e-cbd9-4d26-af38-80645720a89e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.752192 4735 generic.go:334] "Generic (PLEG): container finished" podID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerID="aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d" exitCode=0 Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.752235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerDied","Data":"aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d"} Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.752295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psqg5" event={"ID":"eca4ab4e-cbd9-4d26-af38-80645720a89e","Type":"ContainerDied","Data":"623bb42e25c1640e87ede398f205080bad428f2556cbe2467845fe13bb711dc9"} Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.752316 4735 scope.go:117] "RemoveContainer" containerID="aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 
01:37:01.752995 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psqg5" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.784491 4735 scope.go:117] "RemoveContainer" containerID="689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.808312 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-psqg5"] Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.820361 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-psqg5"] Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.840252 4735 scope.go:117] "RemoveContainer" containerID="e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.874361 4735 scope.go:117] "RemoveContainer" containerID="aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d" Mar 17 01:37:01 crc kubenswrapper[4735]: E0317 01:37:01.875843 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d\": container with ID starting with aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d not found: ID does not exist" containerID="aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.875895 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d"} err="failed to get container status \"aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d\": rpc error: code = NotFound desc = could not find container \"aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d\": container with ID starting with 
aaa2d9e93c212ae363491e8b57ea0c12d5ace5d3104dd2794417b9fcdeeefa6d not found: ID does not exist" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.875915 4735 scope.go:117] "RemoveContainer" containerID="689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5" Mar 17 01:37:01 crc kubenswrapper[4735]: E0317 01:37:01.876607 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5\": container with ID starting with 689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5 not found: ID does not exist" containerID="689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.876624 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5"} err="failed to get container status \"689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5\": rpc error: code = NotFound desc = could not find container \"689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5\": container with ID starting with 689f126c8199e4470252c29051313d9592c78e57b584723013d470d06ca880d5 not found: ID does not exist" Mar 17 01:37:01 crc kubenswrapper[4735]: I0317 01:37:01.876636 4735 scope.go:117] "RemoveContainer" containerID="e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594" Mar 17 01:37:01 crc kubenswrapper[4735]: E0317 01:37:01.876823 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594\": container with ID starting with e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594 not found: ID does not exist" containerID="e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594" Mar 17 01:37:01 crc 
kubenswrapper[4735]: I0317 01:37:01.876837 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594"} err="failed to get container status \"e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594\": rpc error: code = NotFound desc = could not find container \"e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594\": container with ID starting with e8494b68443bb2b189076cb47cb8829fb7989b3ccf491b242a8adbf4149b3594 not found: ID does not exist" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.073871 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:37:03 crc kubenswrapper[4735]: E0317 01:37:03.074498 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.089387 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" path="/var/lib/kubelet/pods/eca4ab4e-cbd9-4d26-af38-80645720a89e/volumes" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.685552 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:37:03 crc kubenswrapper[4735]: E0317 01:37:03.686058 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="extract-content" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.686085 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="extract-content" Mar 17 01:37:03 crc kubenswrapper[4735]: E0317 01:37:03.686123 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="extract-utilities" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.686133 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="extract-utilities" Mar 17 01:37:03 crc kubenswrapper[4735]: E0317 01:37:03.686146 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="registry-server" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.686155 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="registry-server" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.686386 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca4ab4e-cbd9-4d26-af38-80645720a89e" containerName="registry-server" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.688174 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.743381 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.752013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-utilities\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.752079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-catalog-content\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.752229 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dxk\" (UniqueName: \"kubernetes.io/projected/8c0d53e6-a114-4a1e-946e-1f0278e9984c-kube-api-access-24dxk\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.854185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-utilities\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.854244 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-catalog-content\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.854353 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dxk\" (UniqueName: \"kubernetes.io/projected/8c0d53e6-a114-4a1e-946e-1f0278e9984c-kube-api-access-24dxk\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.854649 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-utilities\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.854760 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-catalog-content\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:03 crc kubenswrapper[4735]: I0317 01:37:03.873528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dxk\" (UniqueName: \"kubernetes.io/projected/8c0d53e6-a114-4a1e-946e-1f0278e9984c-kube-api-access-24dxk\") pod \"certified-operators-tfwtc\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:04 crc kubenswrapper[4735]: I0317 01:37:04.063356 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:04 crc kubenswrapper[4735]: I0317 01:37:04.565474 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:37:04 crc kubenswrapper[4735]: I0317 01:37:04.800965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerStarted","Data":"b19b2079393359e89db20e4efeb8879bcdce29dac8963251084fe84bf601c558"} Mar 17 01:37:04 crc kubenswrapper[4735]: I0317 01:37:04.801028 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerStarted","Data":"989891560fda3ac0a087793cfdcbf03cf40d64587167a4e3939286bcf8b62dcd"} Mar 17 01:37:05 crc kubenswrapper[4735]: I0317 01:37:05.813369 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerID="b19b2079393359e89db20e4efeb8879bcdce29dac8963251084fe84bf601c558" exitCode=0 Mar 17 01:37:05 crc kubenswrapper[4735]: I0317 01:37:05.813720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerDied","Data":"b19b2079393359e89db20e4efeb8879bcdce29dac8963251084fe84bf601c558"} Mar 17 01:37:09 crc kubenswrapper[4735]: I0317 01:37:09.859247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerStarted","Data":"16140600b1ad6e6b0f966f760916bd2f1045fce16e5f58a933e57fdfa7818b24"} Mar 17 01:37:10 crc kubenswrapper[4735]: I0317 01:37:10.873957 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" 
containerID="16140600b1ad6e6b0f966f760916bd2f1045fce16e5f58a933e57fdfa7818b24" exitCode=0 Mar 17 01:37:10 crc kubenswrapper[4735]: I0317 01:37:10.874034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerDied","Data":"16140600b1ad6e6b0f966f760916bd2f1045fce16e5f58a933e57fdfa7818b24"} Mar 17 01:37:11 crc kubenswrapper[4735]: I0317 01:37:11.896593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerStarted","Data":"4a11a07354712c109887152b4feb62031a0cd1e0f00d4b7e22db85221970d721"} Mar 17 01:37:11 crc kubenswrapper[4735]: I0317 01:37:11.929493 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tfwtc" podStartSLOduration=2.421093072 podStartE2EDuration="8.929473929s" podCreationTimestamp="2026-03-17 01:37:03 +0000 UTC" firstStartedPulling="2026-03-17 01:37:04.803330128 +0000 UTC m=+1650.435563106" lastFinishedPulling="2026-03-17 01:37:11.311710955 +0000 UTC m=+1656.943943963" observedRunningTime="2026-03-17 01:37:11.927453782 +0000 UTC m=+1657.559686760" watchObservedRunningTime="2026-03-17 01:37:11.929473929 +0000 UTC m=+1657.561706907" Mar 17 01:37:14 crc kubenswrapper[4735]: I0317 01:37:14.063536 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:14 crc kubenswrapper[4735]: I0317 01:37:14.063931 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:15 crc kubenswrapper[4735]: I0317 01:37:15.121810 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-tfwtc" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="registry-server" 
probeResult="failure" output=< Mar 17 01:37:15 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:37:15 crc kubenswrapper[4735]: > Mar 17 01:37:18 crc kubenswrapper[4735]: I0317 01:37:18.072720 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:37:18 crc kubenswrapper[4735]: E0317 01:37:18.073812 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:37:24 crc kubenswrapper[4735]: I0317 01:37:24.114249 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:24 crc kubenswrapper[4735]: I0317 01:37:24.173919 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:37:24 crc kubenswrapper[4735]: I0317 01:37:24.267516 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:37:24 crc kubenswrapper[4735]: I0317 01:37:24.379423 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9sppg"] Mar 17 01:37:24 crc kubenswrapper[4735]: I0317 01:37:24.379953 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9sppg" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="registry-server" containerID="cri-o://0092f459ef06a1b51c84737b1c3cf7425ae7bc5a4a5284c2ff56ae125fb55a1e" gracePeriod=2 Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 
01:37:25.047362 4735 generic.go:334] "Generic (PLEG): container finished" podID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerID="0092f459ef06a1b51c84737b1c3cf7425ae7bc5a4a5284c2ff56ae125fb55a1e" exitCode=0 Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.048668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sppg" event={"ID":"d5d9a075-a0a3-4aae-bd44-459f3df46522","Type":"ContainerDied","Data":"0092f459ef06a1b51c84737b1c3cf7425ae7bc5a4a5284c2ff56ae125fb55a1e"} Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.048756 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sppg" event={"ID":"d5d9a075-a0a3-4aae-bd44-459f3df46522","Type":"ContainerDied","Data":"e5a8a1536d93b308b8568799944693dc5fd93abf1165c6a59ea232a0a5f73857"} Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.048827 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a8a1536d93b308b8568799944693dc5fd93abf1165c6a59ea232a0a5f73857" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.087036 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.210275 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh8z8\" (UniqueName: \"kubernetes.io/projected/d5d9a075-a0a3-4aae-bd44-459f3df46522-kube-api-access-sh8z8\") pod \"d5d9a075-a0a3-4aae-bd44-459f3df46522\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.210699 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-utilities\") pod \"d5d9a075-a0a3-4aae-bd44-459f3df46522\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.211096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-utilities" (OuterVolumeSpecName: "utilities") pod "d5d9a075-a0a3-4aae-bd44-459f3df46522" (UID: "d5d9a075-a0a3-4aae-bd44-459f3df46522"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.211279 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-catalog-content\") pod \"d5d9a075-a0a3-4aae-bd44-459f3df46522\" (UID: \"d5d9a075-a0a3-4aae-bd44-459f3df46522\") " Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.213592 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.216518 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d9a075-a0a3-4aae-bd44-459f3df46522-kube-api-access-sh8z8" (OuterVolumeSpecName: "kube-api-access-sh8z8") pod "d5d9a075-a0a3-4aae-bd44-459f3df46522" (UID: "d5d9a075-a0a3-4aae-bd44-459f3df46522"). InnerVolumeSpecName "kube-api-access-sh8z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.266011 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5d9a075-a0a3-4aae-bd44-459f3df46522" (UID: "d5d9a075-a0a3-4aae-bd44-459f3df46522"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.315779 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh8z8\" (UniqueName: \"kubernetes.io/projected/d5d9a075-a0a3-4aae-bd44-459f3df46522-kube-api-access-sh8z8\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4735]: I0317 01:37:25.316016 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d9a075-a0a3-4aae-bd44-459f3df46522-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:26 crc kubenswrapper[4735]: I0317 01:37:26.053766 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sppg" Mar 17 01:37:26 crc kubenswrapper[4735]: I0317 01:37:26.089773 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9sppg"] Mar 17 01:37:26 crc kubenswrapper[4735]: I0317 01:37:26.103993 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9sppg"] Mar 17 01:37:27 crc kubenswrapper[4735]: I0317 01:37:27.091382 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" path="/var/lib/kubelet/pods/d5d9a075-a0a3-4aae-bd44-459f3df46522/volumes" Mar 17 01:37:33 crc kubenswrapper[4735]: I0317 01:37:33.073897 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:37:33 crc kubenswrapper[4735]: E0317 01:37:33.078470 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.920660 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mbtvv"] Mar 17 01:37:38 crc kubenswrapper[4735]: E0317 01:37:38.921675 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="registry-server" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.921693 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="registry-server" Mar 17 01:37:38 crc kubenswrapper[4735]: E0317 01:37:38.921710 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="extract-utilities" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.921719 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="extract-utilities" Mar 17 01:37:38 crc kubenswrapper[4735]: E0317 01:37:38.921741 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="extract-content" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.921748 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="extract-content" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.921985 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d9a075-a0a3-4aae-bd44-459f3df46522" containerName="registry-server" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.923639 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:38 crc kubenswrapper[4735]: I0317 01:37:38.941767 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mbtvv"] Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.122561 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-utilities\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.122639 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhts\" (UniqueName: \"kubernetes.io/projected/2335aea6-84be-49c5-baed-90d88d2f75e7-kube-api-access-hlhts\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.122666 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-catalog-content\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.224408 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-utilities\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.224529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hlhts\" (UniqueName: \"kubernetes.io/projected/2335aea6-84be-49c5-baed-90d88d2f75e7-kube-api-access-hlhts\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.224554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-catalog-content\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.225621 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-utilities\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.225720 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-catalog-content\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.259556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhts\" (UniqueName: \"kubernetes.io/projected/2335aea6-84be-49c5-baed-90d88d2f75e7-kube-api-access-hlhts\") pod \"redhat-operators-mbtvv\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:39 crc kubenswrapper[4735]: I0317 01:37:39.544466 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:40 crc kubenswrapper[4735]: I0317 01:37:40.019145 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mbtvv"] Mar 17 01:37:40 crc kubenswrapper[4735]: W0317 01:37:40.025297 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2335aea6_84be_49c5_baed_90d88d2f75e7.slice/crio-877865478a32630d62f078d391226546da4d3f7a75027ceea4ff14819f539786 WatchSource:0}: Error finding container 877865478a32630d62f078d391226546da4d3f7a75027ceea4ff14819f539786: Status 404 returned error can't find the container with id 877865478a32630d62f078d391226546da4d3f7a75027ceea4ff14819f539786 Mar 17 01:37:40 crc kubenswrapper[4735]: I0317 01:37:40.238427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerStarted","Data":"877865478a32630d62f078d391226546da4d3f7a75027ceea4ff14819f539786"} Mar 17 01:37:41 crc kubenswrapper[4735]: I0317 01:37:41.251832 4735 generic.go:334] "Generic (PLEG): container finished" podID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerID="049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf" exitCode=0 Mar 17 01:37:41 crc kubenswrapper[4735]: I0317 01:37:41.251910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerDied","Data":"049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf"} Mar 17 01:37:43 crc kubenswrapper[4735]: I0317 01:37:43.278564 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" 
event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerStarted","Data":"9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0"} Mar 17 01:37:46 crc kubenswrapper[4735]: I0317 01:37:46.072657 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:37:46 crc kubenswrapper[4735]: E0317 01:37:46.073336 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:37:50 crc kubenswrapper[4735]: I0317 01:37:50.395716 4735 generic.go:334] "Generic (PLEG): container finished" podID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerID="9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0" exitCode=0 Mar 17 01:37:50 crc kubenswrapper[4735]: I0317 01:37:50.395779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerDied","Data":"9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0"} Mar 17 01:37:51 crc kubenswrapper[4735]: I0317 01:37:51.407041 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerStarted","Data":"cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9"} Mar 17 01:37:51 crc kubenswrapper[4735]: I0317 01:37:51.442349 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mbtvv" podStartSLOduration=3.921354008 podStartE2EDuration="13.442332452s" 
podCreationTimestamp="2026-03-17 01:37:38 +0000 UTC" firstStartedPulling="2026-03-17 01:37:41.254896549 +0000 UTC m=+1686.887129567" lastFinishedPulling="2026-03-17 01:37:50.775875033 +0000 UTC m=+1696.408108011" observedRunningTime="2026-03-17 01:37:51.430805281 +0000 UTC m=+1697.063038269" watchObservedRunningTime="2026-03-17 01:37:51.442332452 +0000 UTC m=+1697.074565430" Mar 17 01:37:52 crc kubenswrapper[4735]: I0317 01:37:52.326404 4735 scope.go:117] "RemoveContainer" containerID="0092f459ef06a1b51c84737b1c3cf7425ae7bc5a4a5284c2ff56ae125fb55a1e" Mar 17 01:37:52 crc kubenswrapper[4735]: I0317 01:37:52.376179 4735 scope.go:117] "RemoveContainer" containerID="d3476732ac9c7a11b80e6341abb3f3f81f7fa92f2777a9e014f9cd164ca59a61" Mar 17 01:37:52 crc kubenswrapper[4735]: I0317 01:37:52.411328 4735 scope.go:117] "RemoveContainer" containerID="87b5b9f1a2d765d08cb6dd2182b286fb6974d93ffc1ea6ba6419fd82a78fef95" Mar 17 01:37:58 crc kubenswrapper[4735]: I0317 01:37:58.073992 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:37:58 crc kubenswrapper[4735]: E0317 01:37:58.077012 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:37:59 crc kubenswrapper[4735]: I0317 01:37:59.544967 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:37:59 crc kubenswrapper[4735]: I0317 01:37:59.545313 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:38:00 crc 
kubenswrapper[4735]: I0317 01:38:00.175680 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561858-hrpqm"] Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.178456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.180942 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.181814 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.183343 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.197750 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-hrpqm"] Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.261715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st68z\" (UniqueName: \"kubernetes.io/projected/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2-kube-api-access-st68z\") pod \"auto-csr-approver-29561858-hrpqm\" (UID: \"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2\") " pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.363635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st68z\" (UniqueName: \"kubernetes.io/projected/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2-kube-api-access-st68z\") pod \"auto-csr-approver-29561858-hrpqm\" (UID: \"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2\") " pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.391533 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-st68z\" (UniqueName: \"kubernetes.io/projected/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2-kube-api-access-st68z\") pod \"auto-csr-approver-29561858-hrpqm\" (UID: \"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2\") " pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.499890 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:00 crc kubenswrapper[4735]: I0317 01:38:00.616838 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mbtvv" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="registry-server" probeResult="failure" output=< Mar 17 01:38:00 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:38:00 crc kubenswrapper[4735]: > Mar 17 01:38:01 crc kubenswrapper[4735]: W0317 01:38:01.040192 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4cb487f_a9aa_4d54_8da1_7b45eae1cff2.slice/crio-44b52035e2a1353e8f4b0d54b13e93297a5e38191083ec859ec9fc8ab964597d WatchSource:0}: Error finding container 44b52035e2a1353e8f4b0d54b13e93297a5e38191083ec859ec9fc8ab964597d: Status 404 returned error can't find the container with id 44b52035e2a1353e8f4b0d54b13e93297a5e38191083ec859ec9fc8ab964597d Mar 17 01:38:01 crc kubenswrapper[4735]: I0317 01:38:01.043996 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-hrpqm"] Mar 17 01:38:01 crc kubenswrapper[4735]: I0317 01:38:01.539953 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" event={"ID":"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2","Type":"ContainerStarted","Data":"44b52035e2a1353e8f4b0d54b13e93297a5e38191083ec859ec9fc8ab964597d"} Mar 17 01:38:02 crc 
kubenswrapper[4735]: I0317 01:38:02.089472 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1db9-account-create-update-sqmm7"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.102546 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-v6brj"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.121776 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0c55-account-create-update-7msf2"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.131102 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1db9-account-create-update-sqmm7"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.140598 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-nw7cf"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.153053 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-v6brj"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.162511 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-nw7cf"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.171931 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0c55-account-create-update-7msf2"] Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.551571 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" event={"ID":"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2","Type":"ContainerStarted","Data":"c3ed463722cf029fd37850d25d47594dc9c642df9a6a56a344cd8213261c2549"} Mar 17 01:38:02 crc kubenswrapper[4735]: I0317 01:38:02.569940 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" podStartSLOduration=1.571388695 podStartE2EDuration="2.569917471s" podCreationTimestamp="2026-03-17 01:38:00 +0000 UTC" firstStartedPulling="2026-03-17 
01:38:01.042320066 +0000 UTC m=+1706.674553044" lastFinishedPulling="2026-03-17 01:38:02.040848822 +0000 UTC m=+1707.673081820" observedRunningTime="2026-03-17 01:38:02.56473187 +0000 UTC m=+1708.196964848" watchObservedRunningTime="2026-03-17 01:38:02.569917471 +0000 UTC m=+1708.202150479" Mar 17 01:38:03 crc kubenswrapper[4735]: I0317 01:38:03.090055 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6b4495-838e-4cda-9d1f-7b087fe4ec50" path="/var/lib/kubelet/pods/1e6b4495-838e-4cda-9d1f-7b087fe4ec50/volumes" Mar 17 01:38:03 crc kubenswrapper[4735]: I0317 01:38:03.094619 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6678bd59-bc5f-4441-b5c7-f68abbf8f385" path="/var/lib/kubelet/pods/6678bd59-bc5f-4441-b5c7-f68abbf8f385/volumes" Mar 17 01:38:03 crc kubenswrapper[4735]: I0317 01:38:03.095925 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b81ce1-9517-4b0f-bae1-fd5e586e98a9" path="/var/lib/kubelet/pods/c2b81ce1-9517-4b0f-bae1-fd5e586e98a9/volumes" Mar 17 01:38:03 crc kubenswrapper[4735]: I0317 01:38:03.096707 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3dfa49-bee3-46cf-982d-6b126b72a46b" path="/var/lib/kubelet/pods/ee3dfa49-bee3-46cf-982d-6b126b72a46b/volumes" Mar 17 01:38:03 crc kubenswrapper[4735]: I0317 01:38:03.573793 4735 generic.go:334] "Generic (PLEG): container finished" podID="a4cb487f-a9aa-4d54-8da1-7b45eae1cff2" containerID="c3ed463722cf029fd37850d25d47594dc9c642df9a6a56a344cd8213261c2549" exitCode=0 Mar 17 01:38:03 crc kubenswrapper[4735]: I0317 01:38:03.573935 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" event={"ID":"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2","Type":"ContainerDied","Data":"c3ed463722cf029fd37850d25d47594dc9c642df9a6a56a344cd8213261c2549"} Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.025260 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.070582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st68z\" (UniqueName: \"kubernetes.io/projected/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2-kube-api-access-st68z\") pod \"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2\" (UID: \"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2\") " Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.077984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2-kube-api-access-st68z" (OuterVolumeSpecName: "kube-api-access-st68z") pod "a4cb487f-a9aa-4d54-8da1-7b45eae1cff2" (UID: "a4cb487f-a9aa-4d54-8da1-7b45eae1cff2"). InnerVolumeSpecName "kube-api-access-st68z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.174323 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st68z\" (UniqueName: \"kubernetes.io/projected/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2-kube-api-access-st68z\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.592586 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" event={"ID":"a4cb487f-a9aa-4d54-8da1-7b45eae1cff2","Type":"ContainerDied","Data":"44b52035e2a1353e8f4b0d54b13e93297a5e38191083ec859ec9fc8ab964597d"} Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.592803 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b52035e2a1353e8f4b0d54b13e93297a5e38191083ec859ec9fc8ab964597d" Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.592879 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-hrpqm" Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.631000 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-2pfnc"] Mar 17 01:38:05 crc kubenswrapper[4735]: I0317 01:38:05.638221 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-2pfnc"] Mar 17 01:38:06 crc kubenswrapper[4735]: I0317 01:38:06.030405 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-af95-account-create-update-lvxl5"] Mar 17 01:38:06 crc kubenswrapper[4735]: I0317 01:38:06.042345 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-af95-account-create-update-lvxl5"] Mar 17 01:38:06 crc kubenswrapper[4735]: I0317 01:38:06.048561 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fjwfz"] Mar 17 01:38:06 crc kubenswrapper[4735]: I0317 01:38:06.055232 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fjwfz"] Mar 17 01:38:07 crc kubenswrapper[4735]: I0317 01:38:07.092057 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ee9a4a-f88f-498f-b41d-d5dd25a00f41" path="/var/lib/kubelet/pods/76ee9a4a-f88f-498f-b41d-d5dd25a00f41/volumes" Mar 17 01:38:07 crc kubenswrapper[4735]: I0317 01:38:07.095891 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77522492-88e3-46d7-a525-badec22c6c4b" path="/var/lib/kubelet/pods/77522492-88e3-46d7-a525-badec22c6c4b/volumes" Mar 17 01:38:07 crc kubenswrapper[4735]: I0317 01:38:07.102573 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02b1480-3872-40f9-98fb-f04132fba4c4" path="/var/lib/kubelet/pods/d02b1480-3872-40f9-98fb-f04132fba4c4/volumes" Mar 17 01:38:09 crc kubenswrapper[4735]: I0317 01:38:09.603129 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:38:09 crc kubenswrapper[4735]: I0317 01:38:09.659472 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:38:10 crc kubenswrapper[4735]: I0317 01:38:10.123022 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mbtvv"] Mar 17 01:38:10 crc kubenswrapper[4735]: I0317 01:38:10.655998 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mbtvv" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="registry-server" containerID="cri-o://cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9" gracePeriod=2 Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.168994 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.202393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-catalog-content\") pod \"2335aea6-84be-49c5-baed-90d88d2f75e7\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.202550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlhts\" (UniqueName: \"kubernetes.io/projected/2335aea6-84be-49c5-baed-90d88d2f75e7-kube-api-access-hlhts\") pod \"2335aea6-84be-49c5-baed-90d88d2f75e7\" (UID: \"2335aea6-84be-49c5-baed-90d88d2f75e7\") " Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.202701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-utilities\") pod \"2335aea6-84be-49c5-baed-90d88d2f75e7\" (UID: 
\"2335aea6-84be-49c5-baed-90d88d2f75e7\") " Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.203361 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-utilities" (OuterVolumeSpecName: "utilities") pod "2335aea6-84be-49c5-baed-90d88d2f75e7" (UID: "2335aea6-84be-49c5-baed-90d88d2f75e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.204353 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.213163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2335aea6-84be-49c5-baed-90d88d2f75e7-kube-api-access-hlhts" (OuterVolumeSpecName: "kube-api-access-hlhts") pod "2335aea6-84be-49c5-baed-90d88d2f75e7" (UID: "2335aea6-84be-49c5-baed-90d88d2f75e7"). InnerVolumeSpecName "kube-api-access-hlhts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.307168 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlhts\" (UniqueName: \"kubernetes.io/projected/2335aea6-84be-49c5-baed-90d88d2f75e7-kube-api-access-hlhts\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.335233 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2335aea6-84be-49c5-baed-90d88d2f75e7" (UID: "2335aea6-84be-49c5-baed-90d88d2f75e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.408895 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2335aea6-84be-49c5-baed-90d88d2f75e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.668479 4735 generic.go:334] "Generic (PLEG): container finished" podID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerID="cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9" exitCode=0 Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.668623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerDied","Data":"cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9"} Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.668824 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mbtvv" event={"ID":"2335aea6-84be-49c5-baed-90d88d2f75e7","Type":"ContainerDied","Data":"877865478a32630d62f078d391226546da4d3f7a75027ceea4ff14819f539786"} Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.668853 4735 scope.go:117] "RemoveContainer" containerID="cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.668682 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mbtvv" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.711350 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mbtvv"] Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.713744 4735 scope.go:117] "RemoveContainer" containerID="9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.723011 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mbtvv"] Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.748786 4735 scope.go:117] "RemoveContainer" containerID="049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.808157 4735 scope.go:117] "RemoveContainer" containerID="cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9" Mar 17 01:38:11 crc kubenswrapper[4735]: E0317 01:38:11.808671 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9\": container with ID starting with cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9 not found: ID does not exist" containerID="cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.808709 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9"} err="failed to get container status \"cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9\": rpc error: code = NotFound desc = could not find container \"cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9\": container with ID starting with cf3d81d9ff7125f79e250871dc51b1364fe1bf0c2e9218660f226e480a5ac5a9 not found: ID does 
not exist" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.808731 4735 scope.go:117] "RemoveContainer" containerID="9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0" Mar 17 01:38:11 crc kubenswrapper[4735]: E0317 01:38:11.809289 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0\": container with ID starting with 9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0 not found: ID does not exist" containerID="9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.809376 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0"} err="failed to get container status \"9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0\": rpc error: code = NotFound desc = could not find container \"9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0\": container with ID starting with 9961b56b944be611abdc22b37de0b5e6c086036afe172e2135744abb66197ce0 not found: ID does not exist" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.809416 4735 scope.go:117] "RemoveContainer" containerID="049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf" Mar 17 01:38:11 crc kubenswrapper[4735]: E0317 01:38:11.810019 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf\": container with ID starting with 049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf not found: ID does not exist" containerID="049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf" Mar 17 01:38:11 crc kubenswrapper[4735]: I0317 01:38:11.810044 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf"} err="failed to get container status \"049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf\": rpc error: code = NotFound desc = could not find container \"049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf\": container with ID starting with 049f7a6cbf2e24439c7cdf09023144ce290869a63cf2f604ec786e584d042bbf not found: ID does not exist" Mar 17 01:38:12 crc kubenswrapper[4735]: I0317 01:38:12.073382 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:38:12 crc kubenswrapper[4735]: E0317 01:38:12.073620 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:38:13 crc kubenswrapper[4735]: I0317 01:38:13.091575 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" path="/var/lib/kubelet/pods/2335aea6-84be-49c5-baed-90d88d2f75e7/volumes" Mar 17 01:38:14 crc kubenswrapper[4735]: I0317 01:38:14.719937 4735 generic.go:334] "Generic (PLEG): container finished" podID="33665f4a-8504-4b35-9850-d2a567b93418" containerID="e01f1632860b0689f38daf4256d525c6ed25796c8e8049ec5a59b66579491e33" exitCode=0 Mar 17 01:38:14 crc kubenswrapper[4735]: I0317 01:38:14.720021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" 
event={"ID":"33665f4a-8504-4b35-9850-d2a567b93418","Type":"ContainerDied","Data":"e01f1632860b0689f38daf4256d525c6ed25796c8e8049ec5a59b66579491e33"} Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.163988 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.211331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-ssh-key-openstack-edpm-ipam\") pod \"33665f4a-8504-4b35-9850-d2a567b93418\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.211548 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-inventory\") pod \"33665f4a-8504-4b35-9850-d2a567b93418\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.211601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89s6m\" (UniqueName: \"kubernetes.io/projected/33665f4a-8504-4b35-9850-d2a567b93418-kube-api-access-89s6m\") pod \"33665f4a-8504-4b35-9850-d2a567b93418\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.211626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-bootstrap-combined-ca-bundle\") pod \"33665f4a-8504-4b35-9850-d2a567b93418\" (UID: \"33665f4a-8504-4b35-9850-d2a567b93418\") " Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.245192 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/33665f4a-8504-4b35-9850-d2a567b93418-kube-api-access-89s6m" (OuterVolumeSpecName: "kube-api-access-89s6m") pod "33665f4a-8504-4b35-9850-d2a567b93418" (UID: "33665f4a-8504-4b35-9850-d2a567b93418"). InnerVolumeSpecName "kube-api-access-89s6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.250060 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "33665f4a-8504-4b35-9850-d2a567b93418" (UID: "33665f4a-8504-4b35-9850-d2a567b93418"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.286807 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "33665f4a-8504-4b35-9850-d2a567b93418" (UID: "33665f4a-8504-4b35-9850-d2a567b93418"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.316016 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89s6m\" (UniqueName: \"kubernetes.io/projected/33665f4a-8504-4b35-9850-d2a567b93418-kube-api-access-89s6m\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.316045 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.316055 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.319117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-inventory" (OuterVolumeSpecName: "inventory") pod "33665f4a-8504-4b35-9850-d2a567b93418" (UID: "33665f4a-8504-4b35-9850-d2a567b93418"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.417552 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33665f4a-8504-4b35-9850-d2a567b93418-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.745841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" event={"ID":"33665f4a-8504-4b35-9850-d2a567b93418","Type":"ContainerDied","Data":"2b4c31010656f2d522f50f7b6a0e54a75b037343d567a04fd7178b6f84d5208e"} Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.745912 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b4c31010656f2d522f50f7b6a0e54a75b037343d567a04fd7178b6f84d5208e" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.746010 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.832489 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj"] Mar 17 01:38:16 crc kubenswrapper[4735]: E0317 01:38:16.832848 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="extract-content" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.832882 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="extract-content" Mar 17 01:38:16 crc kubenswrapper[4735]: E0317 01:38:16.832898 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="registry-server" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.832905 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="registry-server" Mar 17 01:38:16 crc kubenswrapper[4735]: E0317 01:38:16.832918 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cb487f-a9aa-4d54-8da1-7b45eae1cff2" containerName="oc" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.832926 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cb487f-a9aa-4d54-8da1-7b45eae1cff2" containerName="oc" Mar 17 01:38:16 crc kubenswrapper[4735]: E0317 01:38:16.832944 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="extract-utilities" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.832949 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="extract-utilities" Mar 17 01:38:16 crc kubenswrapper[4735]: E0317 01:38:16.832966 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33665f4a-8504-4b35-9850-d2a567b93418" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.832972 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33665f4a-8504-4b35-9850-d2a567b93418" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.833145 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2335aea6-84be-49c5-baed-90d88d2f75e7" containerName="registry-server" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.833158 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cb487f-a9aa-4d54-8da1-7b45eae1cff2" containerName="oc" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.833172 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="33665f4a-8504-4b35-9850-d2a567b93418" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.833730 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.838713 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.838905 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.839033 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.839154 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.857843 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj"] Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.927207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.927472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkns\" (UniqueName: \"kubernetes.io/projected/8697513f-4f89-4aad-9315-2e2053207433-kube-api-access-hwkns\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 
01:38:16 crc kubenswrapper[4735]: I0317 01:38:16.927553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.029666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.029973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkns\" (UniqueName: \"kubernetes.io/projected/8697513f-4f89-4aad-9315-2e2053207433-kube-api-access-hwkns\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.030065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.033806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.034362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.047065 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkns\" (UniqueName: \"kubernetes.io/projected/8697513f-4f89-4aad-9315-2e2053207433-kube-api-access-hwkns\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.159575 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.751438 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj"] Mar 17 01:38:17 crc kubenswrapper[4735]: I0317 01:38:17.753771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" event={"ID":"8697513f-4f89-4aad-9315-2e2053207433","Type":"ContainerStarted","Data":"d26e9d75b1a36c5682e0ff747f4bb12631834db5d984c20318d19a791e416dda"} Mar 17 01:38:18 crc kubenswrapper[4735]: I0317 01:38:18.035172 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2c69q"] Mar 17 01:38:18 crc kubenswrapper[4735]: I0317 01:38:18.046410 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2c69q"] Mar 17 01:38:18 crc kubenswrapper[4735]: I0317 01:38:18.766788 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" event={"ID":"8697513f-4f89-4aad-9315-2e2053207433","Type":"ContainerStarted","Data":"e4963107766b99303c3674424ab9145c3abb148bf85d18696023a566937701ad"} Mar 17 01:38:19 crc kubenswrapper[4735]: I0317 01:38:19.089131 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef" path="/var/lib/kubelet/pods/3e4061ea-70e0-4c6d-9b6b-8d4598ce0aef/volumes" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.480836 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" podStartSLOduration=4.968434973 podStartE2EDuration="5.480819569s" podCreationTimestamp="2026-03-17 01:38:16 +0000 UTC" firstStartedPulling="2026-03-17 01:38:17.731579622 +0000 UTC m=+1723.363812600" 
lastFinishedPulling="2026-03-17 01:38:18.243964198 +0000 UTC m=+1723.876197196" observedRunningTime="2026-03-17 01:38:18.785260754 +0000 UTC m=+1724.417493742" watchObservedRunningTime="2026-03-17 01:38:21.480819569 +0000 UTC m=+1727.113052547" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.485113 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hgg2v"] Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.486912 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.510437 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgg2v"] Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.633565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-catalog-content\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.633606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxs2\" (UniqueName: \"kubernetes.io/projected/01bab6a6-702e-4331-88a0-3ee8a9a1d009-kube-api-access-hxxs2\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.633719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-utilities\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " 
pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.735493 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-utilities\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.735686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-catalog-content\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.735712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxs2\" (UniqueName: \"kubernetes.io/projected/01bab6a6-702e-4331-88a0-3ee8a9a1d009-kube-api-access-hxxs2\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.735989 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-utilities\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.736230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-catalog-content\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " 
pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.754814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxs2\" (UniqueName: \"kubernetes.io/projected/01bab6a6-702e-4331-88a0-3ee8a9a1d009-kube-api-access-hxxs2\") pod \"community-operators-hgg2v\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:21 crc kubenswrapper[4735]: I0317 01:38:21.816905 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:22 crc kubenswrapper[4735]: I0317 01:38:22.307006 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgg2v"] Mar 17 01:38:22 crc kubenswrapper[4735]: W0317 01:38:22.308237 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bab6a6_702e_4331_88a0_3ee8a9a1d009.slice/crio-5fbaf0d0866c9dffc914523dfbebf90f408216102f111fde5b0faa38569c927d WatchSource:0}: Error finding container 5fbaf0d0866c9dffc914523dfbebf90f408216102f111fde5b0faa38569c927d: Status 404 returned error can't find the container with id 5fbaf0d0866c9dffc914523dfbebf90f408216102f111fde5b0faa38569c927d Mar 17 01:38:22 crc kubenswrapper[4735]: I0317 01:38:22.811944 4735 generic.go:334] "Generic (PLEG): container finished" podID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerID="32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916" exitCode=0 Mar 17 01:38:22 crc kubenswrapper[4735]: I0317 01:38:22.812104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerDied","Data":"32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916"} Mar 17 01:38:22 crc kubenswrapper[4735]: I0317 01:38:22.812661 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerStarted","Data":"5fbaf0d0866c9dffc914523dfbebf90f408216102f111fde5b0faa38569c927d"} Mar 17 01:38:23 crc kubenswrapper[4735]: I0317 01:38:23.073663 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:38:23 crc kubenswrapper[4735]: E0317 01:38:23.073890 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:38:23 crc kubenswrapper[4735]: I0317 01:38:23.825478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerStarted","Data":"796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1"} Mar 17 01:38:25 crc kubenswrapper[4735]: I0317 01:38:25.846574 4735 generic.go:334] "Generic (PLEG): container finished" podID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerID="796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1" exitCode=0 Mar 17 01:38:25 crc kubenswrapper[4735]: I0317 01:38:25.846978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerDied","Data":"796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1"} Mar 17 01:38:26 crc kubenswrapper[4735]: I0317 01:38:26.858455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerStarted","Data":"a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569"} Mar 17 01:38:26 crc kubenswrapper[4735]: I0317 01:38:26.885883 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgg2v" podStartSLOduration=2.437241745 podStartE2EDuration="5.885845034s" podCreationTimestamp="2026-03-17 01:38:21 +0000 UTC" firstStartedPulling="2026-03-17 01:38:22.814974676 +0000 UTC m=+1728.447207654" lastFinishedPulling="2026-03-17 01:38:26.263577945 +0000 UTC m=+1731.895810943" observedRunningTime="2026-03-17 01:38:26.879394392 +0000 UTC m=+1732.511627370" watchObservedRunningTime="2026-03-17 01:38:26.885845034 +0000 UTC m=+1732.518078022" Mar 17 01:38:31 crc kubenswrapper[4735]: I0317 01:38:31.817629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:31 crc kubenswrapper[4735]: I0317 01:38:31.818088 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:31 crc kubenswrapper[4735]: I0317 01:38:31.885706 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:31 crc kubenswrapper[4735]: I0317 01:38:31.951568 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:32 crc kubenswrapper[4735]: I0317 01:38:32.132769 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgg2v"] Mar 17 01:38:33 crc kubenswrapper[4735]: I0317 01:38:33.944866 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgg2v" 
podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="registry-server" containerID="cri-o://a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569" gracePeriod=2 Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.452549 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.529141 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-catalog-content\") pod \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.529212 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-utilities\") pod \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.529251 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxxs2\" (UniqueName: \"kubernetes.io/projected/01bab6a6-702e-4331-88a0-3ee8a9a1d009-kube-api-access-hxxs2\") pod \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\" (UID: \"01bab6a6-702e-4331-88a0-3ee8a9a1d009\") " Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.530661 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-utilities" (OuterVolumeSpecName: "utilities") pod "01bab6a6-702e-4331-88a0-3ee8a9a1d009" (UID: "01bab6a6-702e-4331-88a0-3ee8a9a1d009"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.535595 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bab6a6-702e-4331-88a0-3ee8a9a1d009-kube-api-access-hxxs2" (OuterVolumeSpecName: "kube-api-access-hxxs2") pod "01bab6a6-702e-4331-88a0-3ee8a9a1d009" (UID: "01bab6a6-702e-4331-88a0-3ee8a9a1d009"). InnerVolumeSpecName "kube-api-access-hxxs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.631966 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.631994 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxxs2\" (UniqueName: \"kubernetes.io/projected/01bab6a6-702e-4331-88a0-3ee8a9a1d009-kube-api-access-hxxs2\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.641974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01bab6a6-702e-4331-88a0-3ee8a9a1d009" (UID: "01bab6a6-702e-4331-88a0-3ee8a9a1d009"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.733551 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bab6a6-702e-4331-88a0-3ee8a9a1d009-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.955930 4735 generic.go:334] "Generic (PLEG): container finished" podID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerID="a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569" exitCode=0 Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.955980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerDied","Data":"a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569"} Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.956009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2v" event={"ID":"01bab6a6-702e-4331-88a0-3ee8a9a1d009","Type":"ContainerDied","Data":"5fbaf0d0866c9dffc914523dfbebf90f408216102f111fde5b0faa38569c927d"} Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.956029 4735 scope.go:117] "RemoveContainer" containerID="a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569" Mar 17 01:38:34 crc kubenswrapper[4735]: I0317 01:38:34.956172 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgg2v" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.001179 4735 scope.go:117] "RemoveContainer" containerID="796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.008153 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgg2v"] Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.022297 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgg2v"] Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.032347 4735 scope.go:117] "RemoveContainer" containerID="32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.069259 4735 scope.go:117] "RemoveContainer" containerID="a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569" Mar 17 01:38:35 crc kubenswrapper[4735]: E0317 01:38:35.069809 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569\": container with ID starting with a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569 not found: ID does not exist" containerID="a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.069902 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569"} err="failed to get container status \"a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569\": rpc error: code = NotFound desc = could not find container \"a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569\": container with ID starting with a2ad6cd46de937d22024f4ebc9ba0a7c487d7eeee26a83b0a6b648564129b569 not 
found: ID does not exist" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.069938 4735 scope.go:117] "RemoveContainer" containerID="796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1" Mar 17 01:38:35 crc kubenswrapper[4735]: E0317 01:38:35.070787 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1\": container with ID starting with 796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1 not found: ID does not exist" containerID="796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.070827 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1"} err="failed to get container status \"796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1\": rpc error: code = NotFound desc = could not find container \"796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1\": container with ID starting with 796c3dfac388eea44759e073a531ce66305f0cbebc1230ce3fded14f991030a1 not found: ID does not exist" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.070944 4735 scope.go:117] "RemoveContainer" containerID="32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916" Mar 17 01:38:35 crc kubenswrapper[4735]: E0317 01:38:35.071321 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916\": container with ID starting with 32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916 not found: ID does not exist" containerID="32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.071366 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916"} err="failed to get container status \"32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916\": rpc error: code = NotFound desc = could not find container \"32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916\": container with ID starting with 32038f791e1f97ae80d3f87e4acb17786c6251e72c5b9e614de7a059efbd2916 not found: ID does not exist" Mar 17 01:38:35 crc kubenswrapper[4735]: I0317 01:38:35.094608 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" path="/var/lib/kubelet/pods/01bab6a6-702e-4331-88a0-3ee8a9a1d009/volumes" Mar 17 01:38:36 crc kubenswrapper[4735]: I0317 01:38:36.052689 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vsj4g"] Mar 17 01:38:36 crc kubenswrapper[4735]: I0317 01:38:36.062473 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vsj4g"] Mar 17 01:38:37 crc kubenswrapper[4735]: I0317 01:38:37.086774 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c0c683-4152-4360-a49f-22a72c25ad1c" path="/var/lib/kubelet/pods/70c0c683-4152-4360-a49f-22a72c25ad1c/volumes" Mar 17 01:38:38 crc kubenswrapper[4735]: I0317 01:38:38.073456 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:38:38 crc kubenswrapper[4735]: E0317 01:38:38.073960 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.503443 4735 scope.go:117] "RemoveContainer" containerID="cccd3a6760b6702e9d69c6188245bf360640cbe3c4dfc4f68224e1e2841ef165" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.539536 4735 scope.go:117] "RemoveContainer" containerID="6e6d50c8976a0d13bc7bc7f61efc7bcec9c0cea3cc50648f8545f814963fb1ca" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.581241 4735 scope.go:117] "RemoveContainer" containerID="3708d5d739a7b0d7bfe2ee97b550886a59361062400c7e342597fa5baad6e278" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.636081 4735 scope.go:117] "RemoveContainer" containerID="39fb346fce05fccbaaa9eb90bae49c62db4b24855650346e89adf8ea5154ad2b" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.681347 4735 scope.go:117] "RemoveContainer" containerID="6aafdd5cb311e4f1a4c682844f51021f4ac983a71a9add6b070c6e3b5040923a" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.699928 4735 scope.go:117] "RemoveContainer" containerID="c3c74813aabd7080ea6c26cf817a2dfdb7b4a87ac38118760ef61dd31cd50a24" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.721303 4735 scope.go:117] "RemoveContainer" containerID="a2758603db03a8c385747f6d3a18cc38c2694322e5dd0bffa04f1c7a165bafa3" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.761178 4735 scope.go:117] "RemoveContainer" containerID="d0c1f186b72363bd57f61da70ab2cff09fccf6706495b67df421fd61a118cab3" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.779924 4735 scope.go:117] "RemoveContainer" containerID="91faf749add6a123dabdccc8d3cf17c7b85b229fa2e2ca1be83a410028ef93c7" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.834383 4735 scope.go:117] "RemoveContainer" containerID="11c8ba407d5421d1afc3bcd2b34a7955529e2b4abea59081714815d3bc27d06a" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.868437 4735 scope.go:117] "RemoveContainer" 
containerID="4d6ad7277b88a97c02ef32dc3a50769fc6d1b17c26a84bbf63a7578431d82106" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.896591 4735 scope.go:117] "RemoveContainer" containerID="b7539710c9b3565c378067c0fd583547bb5bacd3f09bbe9b6ea86658c5361ecd" Mar 17 01:38:52 crc kubenswrapper[4735]: I0317 01:38:52.924693 4735 scope.go:117] "RemoveContainer" containerID="037fb4e644505ab8593322897ca936e5d8b7c7a907380a68a6d599453e33be17" Mar 17 01:38:53 crc kubenswrapper[4735]: I0317 01:38:53.074426 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:38:53 crc kubenswrapper[4735]: E0317 01:38:53.074919 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.113403 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9b09-account-create-update-mtbq2"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.151697 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9b09-account-create-update-mtbq2"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.159627 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-bf9f-account-create-update-6plhb"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.173886 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-56e5-account-create-update-fjdzb"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.183920 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-bf9f-account-create-update-6plhb"] Mar 17 01:38:54 
crc kubenswrapper[4735]: I0317 01:38:54.187740 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-56e5-account-create-update-fjdzb"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.195542 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9e40-account-create-update-jcwj9"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.203619 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wgc9q"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.211512 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9e40-account-create-update-jcwj9"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.220769 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wgc9q"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.227065 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-8rx27"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.235543 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hbfv6"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.246543 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-8rx27"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.254160 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-htvtb"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.264073 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hbfv6"] Mar 17 01:38:54 crc kubenswrapper[4735]: I0317 01:38:54.272420 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-htvtb"] Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.093849 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="374187e8-9018-4baa-a078-1f80c9b4f0ff" 
path="/var/lib/kubelet/pods/374187e8-9018-4baa-a078-1f80c9b4f0ff/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.098370 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abe75a9-68d1-4233-a036-93f692c82544" path="/var/lib/kubelet/pods/6abe75a9-68d1-4233-a036-93f692c82544/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.104391 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e99284f-9b1b-408c-8d81-9562bcf5a449" path="/var/lib/kubelet/pods/7e99284f-9b1b-408c-8d81-9562bcf5a449/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.109123 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d71250-2369-4916-a9ba-1e5ecaa7ac76" path="/var/lib/kubelet/pods/97d71250-2369-4916-a9ba-1e5ecaa7ac76/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.112467 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988c6c3e-17e6-48d3-8428-3103523980a0" path="/var/lib/kubelet/pods/988c6c3e-17e6-48d3-8428-3103523980a0/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.116104 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55338f9-1081-4b02-b2a5-bddfd3d1f8c5" path="/var/lib/kubelet/pods/a55338f9-1081-4b02-b2a5-bddfd3d1f8c5/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.119589 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84207b1-d950-43ac-b9b9-315e32f3abce" path="/var/lib/kubelet/pods/a84207b1-d950-43ac-b9b9-315e32f3abce/volumes" Mar 17 01:38:55 crc kubenswrapper[4735]: I0317 01:38:55.122584 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9050259-4c72-4b29-8025-7ad50bc08910" path="/var/lib/kubelet/pods/f9050259-4c72-4b29-8025-7ad50bc08910/volumes" Mar 17 01:38:59 crc kubenswrapper[4735]: I0317 01:38:59.049913 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hx4sl"] Mar 17 01:38:59 
crc kubenswrapper[4735]: I0317 01:38:59.062519 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hx4sl"] Mar 17 01:38:59 crc kubenswrapper[4735]: I0317 01:38:59.093124 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706bfd83-931b-46f6-9c7a-aa4e915cb054" path="/var/lib/kubelet/pods/706bfd83-931b-46f6-9c7a-aa4e915cb054/volumes" Mar 17 01:39:05 crc kubenswrapper[4735]: I0317 01:39:05.079392 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:39:05 crc kubenswrapper[4735]: E0317 01:39:05.080194 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:39:18 crc kubenswrapper[4735]: I0317 01:39:18.073282 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:39:18 crc kubenswrapper[4735]: E0317 01:39:18.074621 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:39:33 crc kubenswrapper[4735]: I0317 01:39:33.074826 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:39:33 crc kubenswrapper[4735]: E0317 01:39:33.076104 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:39:37 crc kubenswrapper[4735]: I0317 01:39:37.070584 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-6v4rh"] Mar 17 01:39:37 crc kubenswrapper[4735]: I0317 01:39:37.093659 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-6v4rh"] Mar 17 01:39:39 crc kubenswrapper[4735]: I0317 01:39:39.086337 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef223ff-d0aa-44b8-b8cd-88242ceaee8e" path="/var/lib/kubelet/pods/2ef223ff-d0aa-44b8-b8cd-88242ceaee8e/volumes" Mar 17 01:39:45 crc kubenswrapper[4735]: I0317 01:39:45.092242 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:39:45 crc kubenswrapper[4735]: E0317 01:39:45.093450 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:39:48 crc kubenswrapper[4735]: I0317 01:39:48.052938 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cw9fr"] Mar 17 01:39:48 crc kubenswrapper[4735]: I0317 01:39:48.062612 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s6vdx"] Mar 17 01:39:48 crc 
kubenswrapper[4735]: I0317 01:39:48.083255 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cw9fr"] Mar 17 01:39:48 crc kubenswrapper[4735]: I0317 01:39:48.092557 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s6vdx"] Mar 17 01:39:49 crc kubenswrapper[4735]: I0317 01:39:49.039840 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-94fhf"] Mar 17 01:39:49 crc kubenswrapper[4735]: I0317 01:39:49.048654 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-94fhf"] Mar 17 01:39:49 crc kubenswrapper[4735]: I0317 01:39:49.087158 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="322cb9be-148e-4dc0-8e9c-ae716ed6925f" path="/var/lib/kubelet/pods/322cb9be-148e-4dc0-8e9c-ae716ed6925f/volumes" Mar 17 01:39:49 crc kubenswrapper[4735]: I0317 01:39:49.090595 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4010b52e-ec5e-431e-9a94-48a03dfdcce6" path="/var/lib/kubelet/pods/4010b52e-ec5e-431e-9a94-48a03dfdcce6/volumes" Mar 17 01:39:49 crc kubenswrapper[4735]: I0317 01:39:49.093478 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5b000f-5d9d-4d80-b979-725e03851ba6" path="/var/lib/kubelet/pods/8c5b000f-5d9d-4d80-b979-725e03851ba6/volumes" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.246735 4735 scope.go:117] "RemoveContainer" containerID="577c06385ed64e451c78f2cf0eb3b6cc37f1e737f89318e8008e3cab095ecd05" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.276954 4735 scope.go:117] "RemoveContainer" containerID="c21e38ee0e20a18a4196446c8deb43608008ad76851d0decefa2fa915c851264" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.364173 4735 scope.go:117] "RemoveContainer" containerID="f74c1098ad49626659407059a2ec2ac6ecaf5ba8ef5c9460edc720b413828675" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.401806 4735 scope.go:117] 
"RemoveContainer" containerID="e3eeba2f5c376c0fd87d093ab9ebbfc8f61c30f847d61eccea94c58aa6db7eed" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.449157 4735 scope.go:117] "RemoveContainer" containerID="7a071a3cb2dbe118048ec162ac705f89d0d6f9b22b5f16a5d1b1e6ee98c86e43" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.487321 4735 scope.go:117] "RemoveContainer" containerID="062f033a7acde18e561f1b7cf13d5bffd51a2a1b919eca7ca1931179edbc6021" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.513407 4735 scope.go:117] "RemoveContainer" containerID="26f9ee346dc5c7523fb01f1805609575ccf514e9b54d884ebae6ca9193f32627" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.559436 4735 scope.go:117] "RemoveContainer" containerID="61ff74f9fd46346d6c44802a63bfa4e89c16c6ec7e9cc4c17a6acd0541931b28" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.580086 4735 scope.go:117] "RemoveContainer" containerID="751d91159dd856b28724425e05a5ff3e0cbd5e33dea057cbb77b9cb377b92956" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.605190 4735 scope.go:117] "RemoveContainer" containerID="0db7921a6390bef1e3f7459cef41cdf57a32b61369cbce149dd52ac72d3b329f" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.638223 4735 scope.go:117] "RemoveContainer" containerID="f1cf770ec3fdf44bba1dac1d20b6167c2c98c268035cf093e4b74cf7bdc4d92a" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.664632 4735 scope.go:117] "RemoveContainer" containerID="89af35daf3119b143b4d5ae39f194c8902d0d3bd76e39004d16929e78961c34e" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.687197 4735 scope.go:117] "RemoveContainer" containerID="b27cf6d8ef5ac74fa72905b2b3e8ebf15e28b25d68410183d6f6e677342e438e" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.710553 4735 scope.go:117] "RemoveContainer" containerID="68cb4547e382ebf1c68944841418b06d8ebdd75b3cce0044083697df9e27f405" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.733197 4735 scope.go:117] "RemoveContainer" 
containerID="b36ed64453ea7406e638f1eb444575a2fa6027ef986a26c950e8b304f397f2f6" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.763017 4735 scope.go:117] "RemoveContainer" containerID="c0ffda15eb08aa3552865f5ffa0080fdcaaf66a3d9f174cff1495bb9f782d27a" Mar 17 01:39:53 crc kubenswrapper[4735]: I0317 01:39:53.791072 4735 scope.go:117] "RemoveContainer" containerID="50843384154bbb40178842e70addad049293a918e0e745b0fbff961692eb171c" Mar 17 01:39:56 crc kubenswrapper[4735]: I0317 01:39:56.073527 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:39:56 crc kubenswrapper[4735]: E0317 01:39:56.074091 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.183352 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561860-t4kxz"] Mar 17 01:40:00 crc kubenswrapper[4735]: E0317 01:40:00.184109 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="extract-content" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.184120 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="extract-content" Mar 17 01:40:00 crc kubenswrapper[4735]: E0317 01:40:00.184138 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="registry-server" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.184144 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="registry-server" Mar 17 01:40:00 crc kubenswrapper[4735]: E0317 01:40:00.184165 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="extract-utilities" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.184171 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="extract-utilities" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.184334 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bab6a6-702e-4331-88a0-3ee8a9a1d009" containerName="registry-server" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.184941 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.194881 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.198519 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.198719 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.199491 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-t4kxz"] Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.329627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glf95\" (UniqueName: \"kubernetes.io/projected/b9b21370-ca31-43b2-a5fb-78409f181a24-kube-api-access-glf95\") pod \"auto-csr-approver-29561860-t4kxz\" (UID: \"b9b21370-ca31-43b2-a5fb-78409f181a24\") " 
pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.431180 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glf95\" (UniqueName: \"kubernetes.io/projected/b9b21370-ca31-43b2-a5fb-78409f181a24-kube-api-access-glf95\") pod \"auto-csr-approver-29561860-t4kxz\" (UID: \"b9b21370-ca31-43b2-a5fb-78409f181a24\") " pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.475139 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glf95\" (UniqueName: \"kubernetes.io/projected/b9b21370-ca31-43b2-a5fb-78409f181a24-kube-api-access-glf95\") pod \"auto-csr-approver-29561860-t4kxz\" (UID: \"b9b21370-ca31-43b2-a5fb-78409f181a24\") " pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:00 crc kubenswrapper[4735]: I0317 01:40:00.565530 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:01 crc kubenswrapper[4735]: I0317 01:40:01.062350 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-t4kxz"] Mar 17 01:40:01 crc kubenswrapper[4735]: W0317 01:40:01.070074 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9b21370_ca31_43b2_a5fb_78409f181a24.slice/crio-2190e08fdbc1fc73d30a757999e3c7964fb2cdd40d5c990f744e507bcc9739e5 WatchSource:0}: Error finding container 2190e08fdbc1fc73d30a757999e3c7964fb2cdd40d5c990f744e507bcc9739e5: Status 404 returned error can't find the container with id 2190e08fdbc1fc73d30a757999e3c7964fb2cdd40d5c990f744e507bcc9739e5 Mar 17 01:40:01 crc kubenswrapper[4735]: I0317 01:40:01.968198 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" 
event={"ID":"b9b21370-ca31-43b2-a5fb-78409f181a24","Type":"ContainerStarted","Data":"2190e08fdbc1fc73d30a757999e3c7964fb2cdd40d5c990f744e507bcc9739e5"} Mar 17 01:40:02 crc kubenswrapper[4735]: I0317 01:40:02.980383 4735 generic.go:334] "Generic (PLEG): container finished" podID="b9b21370-ca31-43b2-a5fb-78409f181a24" containerID="435841b462148c1e0951eacc109db40a377200217e3bab76ad5f12b55f0c1a2f" exitCode=0 Mar 17 01:40:02 crc kubenswrapper[4735]: I0317 01:40:02.980700 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" event={"ID":"b9b21370-ca31-43b2-a5fb-78409f181a24","Type":"ContainerDied","Data":"435841b462148c1e0951eacc109db40a377200217e3bab76ad5f12b55f0c1a2f"} Mar 17 01:40:04 crc kubenswrapper[4735]: I0317 01:40:04.396345 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:04 crc kubenswrapper[4735]: I0317 01:40:04.514250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glf95\" (UniqueName: \"kubernetes.io/projected/b9b21370-ca31-43b2-a5fb-78409f181a24-kube-api-access-glf95\") pod \"b9b21370-ca31-43b2-a5fb-78409f181a24\" (UID: \"b9b21370-ca31-43b2-a5fb-78409f181a24\") " Mar 17 01:40:04 crc kubenswrapper[4735]: I0317 01:40:04.525082 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b21370-ca31-43b2-a5fb-78409f181a24-kube-api-access-glf95" (OuterVolumeSpecName: "kube-api-access-glf95") pod "b9b21370-ca31-43b2-a5fb-78409f181a24" (UID: "b9b21370-ca31-43b2-a5fb-78409f181a24"). InnerVolumeSpecName "kube-api-access-glf95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:40:04 crc kubenswrapper[4735]: I0317 01:40:04.616980 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glf95\" (UniqueName: \"kubernetes.io/projected/b9b21370-ca31-43b2-a5fb-78409f181a24-kube-api-access-glf95\") on node \"crc\" DevicePath \"\"" Mar 17 01:40:05 crc kubenswrapper[4735]: I0317 01:40:05.000634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" event={"ID":"b9b21370-ca31-43b2-a5fb-78409f181a24","Type":"ContainerDied","Data":"2190e08fdbc1fc73d30a757999e3c7964fb2cdd40d5c990f744e507bcc9739e5"} Mar 17 01:40:05 crc kubenswrapper[4735]: I0317 01:40:05.000681 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-t4kxz" Mar 17 01:40:05 crc kubenswrapper[4735]: I0317 01:40:05.000699 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2190e08fdbc1fc73d30a757999e3c7964fb2cdd40d5c990f744e507bcc9739e5" Mar 17 01:40:05 crc kubenswrapper[4735]: I0317 01:40:05.488942 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-4vwb5"] Mar 17 01:40:05 crc kubenswrapper[4735]: I0317 01:40:05.500142 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-4vwb5"] Mar 17 01:40:07 crc kubenswrapper[4735]: I0317 01:40:07.097977 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d2c860-7702-4c17-b763-8a2da103ca6d" path="/var/lib/kubelet/pods/25d2c860-7702-4c17-b763-8a2da103ca6d/volumes" Mar 17 01:40:08 crc kubenswrapper[4735]: I0317 01:40:08.075536 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:40:08 crc kubenswrapper[4735]: E0317 01:40:08.076587 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:40:13 crc kubenswrapper[4735]: I0317 01:40:13.061044 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bl4z7"] Mar 17 01:40:13 crc kubenswrapper[4735]: I0317 01:40:13.068981 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bl4z7"] Mar 17 01:40:13 crc kubenswrapper[4735]: I0317 01:40:13.088808 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfd83df-2abe-43e9-ab0b-88ec269eb204" path="/var/lib/kubelet/pods/bcfd83df-2abe-43e9-ab0b-88ec269eb204/volumes" Mar 17 01:40:13 crc kubenswrapper[4735]: I0317 01:40:13.089797 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-97m76"] Mar 17 01:40:13 crc kubenswrapper[4735]: I0317 01:40:13.092742 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-97m76"] Mar 17 01:40:14 crc kubenswrapper[4735]: I0317 01:40:14.120542 4735 generic.go:334] "Generic (PLEG): container finished" podID="8697513f-4f89-4aad-9315-2e2053207433" containerID="e4963107766b99303c3674424ab9145c3abb148bf85d18696023a566937701ad" exitCode=0 Mar 17 01:40:14 crc kubenswrapper[4735]: I0317 01:40:14.120717 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" event={"ID":"8697513f-4f89-4aad-9315-2e2053207433","Type":"ContainerDied","Data":"e4963107766b99303c3674424ab9145c3abb148bf85d18696023a566937701ad"} Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.092366 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f4f6de0a-1a49-4470-80f3-5c807d5899a4" path="/var/lib/kubelet/pods/f4f6de0a-1a49-4470-80f3-5c807d5899a4/volumes" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.592905 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.763981 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkns\" (UniqueName: \"kubernetes.io/projected/8697513f-4f89-4aad-9315-2e2053207433-kube-api-access-hwkns\") pod \"8697513f-4f89-4aad-9315-2e2053207433\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.764028 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-inventory\") pod \"8697513f-4f89-4aad-9315-2e2053207433\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.764143 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-ssh-key-openstack-edpm-ipam\") pod \"8697513f-4f89-4aad-9315-2e2053207433\" (UID: \"8697513f-4f89-4aad-9315-2e2053207433\") " Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.776899 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8697513f-4f89-4aad-9315-2e2053207433-kube-api-access-hwkns" (OuterVolumeSpecName: "kube-api-access-hwkns") pod "8697513f-4f89-4aad-9315-2e2053207433" (UID: "8697513f-4f89-4aad-9315-2e2053207433"). InnerVolumeSpecName "kube-api-access-hwkns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.794831 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-inventory" (OuterVolumeSpecName: "inventory") pod "8697513f-4f89-4aad-9315-2e2053207433" (UID: "8697513f-4f89-4aad-9315-2e2053207433"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.806620 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8697513f-4f89-4aad-9315-2e2053207433" (UID: "8697513f-4f89-4aad-9315-2e2053207433"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.866371 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwkns\" (UniqueName: \"kubernetes.io/projected/8697513f-4f89-4aad-9315-2e2053207433-kube-api-access-hwkns\") on node \"crc\" DevicePath \"\"" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.866406 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:40:15 crc kubenswrapper[4735]: I0317 01:40:15.866417 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697513f-4f89-4aad-9315-2e2053207433-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.145213 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.145129 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj" event={"ID":"8697513f-4f89-4aad-9315-2e2053207433","Type":"ContainerDied","Data":"d26e9d75b1a36c5682e0ff747f4bb12631834db5d984c20318d19a791e416dda"} Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.147633 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d26e9d75b1a36c5682e0ff747f4bb12631834db5d984c20318d19a791e416dda" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.254038 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp"] Mar 17 01:40:16 crc kubenswrapper[4735]: E0317 01:40:16.255069 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b21370-ca31-43b2-a5fb-78409f181a24" containerName="oc" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.255105 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b21370-ca31-43b2-a5fb-78409f181a24" containerName="oc" Mar 17 01:40:16 crc kubenswrapper[4735]: E0317 01:40:16.255142 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8697513f-4f89-4aad-9315-2e2053207433" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.255153 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8697513f-4f89-4aad-9315-2e2053207433" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.255601 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8697513f-4f89-4aad-9315-2e2053207433" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.255637 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b21370-ca31-43b2-a5fb-78409f181a24" containerName="oc" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.257258 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.261569 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.261806 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.261960 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.262084 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.265929 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp"] Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.385095 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.385378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.385560 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrp8n\" (UniqueName: \"kubernetes.io/projected/ecd2012c-981d-4582-8dde-afedd29a4108-kube-api-access-qrp8n\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.486449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.486550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.486745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrp8n\" (UniqueName: \"kubernetes.io/projected/ecd2012c-981d-4582-8dde-afedd29a4108-kube-api-access-qrp8n\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.492002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.492685 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.517734 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrp8n\" (UniqueName: \"kubernetes.io/projected/ecd2012c-981d-4582-8dde-afedd29a4108-kube-api-access-qrp8n\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:16 crc kubenswrapper[4735]: I0317 01:40:16.583069 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:40:17 crc kubenswrapper[4735]: I0317 01:40:17.193130 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp"] Mar 17 01:40:18 crc kubenswrapper[4735]: I0317 01:40:18.168821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" event={"ID":"ecd2012c-981d-4582-8dde-afedd29a4108","Type":"ContainerStarted","Data":"35d129d62e6604999deb3ae8c1181ccbd22cdb1cdcbdfacacbaf6f0d71ded4c0"} Mar 17 01:40:18 crc kubenswrapper[4735]: I0317 01:40:18.169508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" event={"ID":"ecd2012c-981d-4582-8dde-afedd29a4108","Type":"ContainerStarted","Data":"73821304111d48b0c21249bb3dab31d49f2f39c6b9b49c1bcc339718473f370a"} Mar 17 01:40:20 crc kubenswrapper[4735]: I0317 01:40:20.073166 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:40:20 crc kubenswrapper[4735]: E0317 01:40:20.074235 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:40:33 crc kubenswrapper[4735]: I0317 01:40:33.077177 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:40:33 crc kubenswrapper[4735]: E0317 01:40:33.079882 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:40:44 crc kubenswrapper[4735]: I0317 01:40:44.073978 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:40:44 crc kubenswrapper[4735]: E0317 01:40:44.075379 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:40:54 crc kubenswrapper[4735]: I0317 01:40:54.128143 4735 scope.go:117] "RemoveContainer" containerID="22790db9bae7a752fa7d704f810b22121b9b82d55b6ac4f75bff5e5f2172eeb3" Mar 17 01:40:54 crc kubenswrapper[4735]: I0317 01:40:54.164413 4735 scope.go:117] "RemoveContainer" containerID="74f69d396b063e0cc21e6cb1dd81e367276d64596eb0b8ad180810b3b0140457" Mar 17 01:40:54 crc kubenswrapper[4735]: I0317 01:40:54.243999 4735 scope.go:117] "RemoveContainer" containerID="871c342605904236a032b7990bc765f65b061a248331a3b5c1846c04f84be997" Mar 17 01:40:57 crc kubenswrapper[4735]: I0317 01:40:57.074422 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:40:57 crc kubenswrapper[4735]: E0317 01:40:57.074977 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:41:09 crc kubenswrapper[4735]: I0317 01:41:09.074799 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:41:09 crc kubenswrapper[4735]: E0317 01:41:09.075776 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.053139 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" podStartSLOduration=66.507391563 podStartE2EDuration="1m7.053121063s" podCreationTimestamp="2026-03-17 01:40:16 +0000 UTC" firstStartedPulling="2026-03-17 01:40:17.219007556 +0000 UTC m=+1842.851240544" lastFinishedPulling="2026-03-17 01:40:17.764737056 +0000 UTC m=+1843.396970044" observedRunningTime="2026-03-17 01:40:18.193891856 +0000 UTC m=+1843.826124874" watchObservedRunningTime="2026-03-17 01:41:23.053121063 +0000 UTC m=+1908.685354051" Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.067453 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-990a-account-create-update-6chjv"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.083156 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4cg78"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.085681 4735 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-0213-account-create-update-cls5l"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.101350 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4d4c-account-create-update-sc92d"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.108637 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-s8vgv"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.115272 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gxtdn"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.121584 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4cg78"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.127785 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-gxtdn"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.136030 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-990a-account-create-update-6chjv"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.142982 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0213-account-create-update-cls5l"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.149494 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4d4c-account-create-update-sc92d"] Mar 17 01:41:23 crc kubenswrapper[4735]: I0317 01:41:23.156845 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-s8vgv"] Mar 17 01:41:24 crc kubenswrapper[4735]: I0317 01:41:24.073626 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:41:24 crc kubenswrapper[4735]: I0317 01:41:24.803301 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"599b0f1aa110d3294f873b6200db87c7a4f4c9bfd5e3739ccd5f9a116ac7a76c"} Mar 17 01:41:25 crc kubenswrapper[4735]: I0317 01:41:25.093464 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02edf717-dfa9-4226-b695-f438427bc8a4" path="/var/lib/kubelet/pods/02edf717-dfa9-4226-b695-f438427bc8a4/volumes" Mar 17 01:41:25 crc kubenswrapper[4735]: I0317 01:41:25.097638 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387441db-b51a-4a3d-b205-bb2f951a8700" path="/var/lib/kubelet/pods/387441db-b51a-4a3d-b205-bb2f951a8700/volumes" Mar 17 01:41:25 crc kubenswrapper[4735]: I0317 01:41:25.102305 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7202ec2d-61ea-45a1-9992-e48fdf57e0db" path="/var/lib/kubelet/pods/7202ec2d-61ea-45a1-9992-e48fdf57e0db/volumes" Mar 17 01:41:25 crc kubenswrapper[4735]: I0317 01:41:25.107513 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c1db3e-740b-434d-b7bf-1938df4f05e2" path="/var/lib/kubelet/pods/80c1db3e-740b-434d-b7bf-1938df4f05e2/volumes" Mar 17 01:41:25 crc kubenswrapper[4735]: I0317 01:41:25.109492 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0148389-8934-4f76-8ba8-bd589163623d" path="/var/lib/kubelet/pods/b0148389-8934-4f76-8ba8-bd589163623d/volumes" Mar 17 01:41:25 crc kubenswrapper[4735]: I0317 01:41:25.113943 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c423ac16-ac60-4ad3-9a8e-a1e5be701162" path="/var/lib/kubelet/pods/c423ac16-ac60-4ad3-9a8e-a1e5be701162/volumes" Mar 17 01:41:29 crc kubenswrapper[4735]: I0317 01:41:29.865095 4735 generic.go:334] "Generic (PLEG): container finished" podID="ecd2012c-981d-4582-8dde-afedd29a4108" containerID="35d129d62e6604999deb3ae8c1181ccbd22cdb1cdcbdfacacbaf6f0d71ded4c0" exitCode=0 Mar 17 01:41:29 crc kubenswrapper[4735]: I0317 01:41:29.865190 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" event={"ID":"ecd2012c-981d-4582-8dde-afedd29a4108","Type":"ContainerDied","Data":"35d129d62e6604999deb3ae8c1181ccbd22cdb1cdcbdfacacbaf6f0d71ded4c0"} Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.400012 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.516228 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-inventory\") pod \"ecd2012c-981d-4582-8dde-afedd29a4108\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.516405 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrp8n\" (UniqueName: \"kubernetes.io/projected/ecd2012c-981d-4582-8dde-afedd29a4108-kube-api-access-qrp8n\") pod \"ecd2012c-981d-4582-8dde-afedd29a4108\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.516604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-ssh-key-openstack-edpm-ipam\") pod \"ecd2012c-981d-4582-8dde-afedd29a4108\" (UID: \"ecd2012c-981d-4582-8dde-afedd29a4108\") " Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.525139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd2012c-981d-4582-8dde-afedd29a4108-kube-api-access-qrp8n" (OuterVolumeSpecName: "kube-api-access-qrp8n") pod "ecd2012c-981d-4582-8dde-afedd29a4108" (UID: "ecd2012c-981d-4582-8dde-afedd29a4108"). InnerVolumeSpecName "kube-api-access-qrp8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.554696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ecd2012c-981d-4582-8dde-afedd29a4108" (UID: "ecd2012c-981d-4582-8dde-afedd29a4108"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.559566 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-inventory" (OuterVolumeSpecName: "inventory") pod "ecd2012c-981d-4582-8dde-afedd29a4108" (UID: "ecd2012c-981d-4582-8dde-afedd29a4108"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.619899 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.619938 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecd2012c-981d-4582-8dde-afedd29a4108-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.619954 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrp8n\" (UniqueName: \"kubernetes.io/projected/ecd2012c-981d-4582-8dde-afedd29a4108-kube-api-access-qrp8n\") on node \"crc\" DevicePath \"\"" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.892691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" 
event={"ID":"ecd2012c-981d-4582-8dde-afedd29a4108","Type":"ContainerDied","Data":"73821304111d48b0c21249bb3dab31d49f2f39c6b9b49c1bcc339718473f370a"} Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.892757 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73821304111d48b0c21249bb3dab31d49f2f39c6b9b49c1bcc339718473f370a" Mar 17 01:41:31 crc kubenswrapper[4735]: I0317 01:41:31.892818 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.006003 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5"] Mar 17 01:41:32 crc kubenswrapper[4735]: E0317 01:41:32.006347 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd2012c-981d-4582-8dde-afedd29a4108" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.006363 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd2012c-981d-4582-8dde-afedd29a4108" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.006545 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd2012c-981d-4582-8dde-afedd29a4108" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.007213 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.009955 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.010256 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.010472 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.010581 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.024212 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5"] Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.133059 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.133111 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4b9j\" (UniqueName: \"kubernetes.io/projected/979aea20-7a48-4bff-9188-685c664c6a78-kube-api-access-h4b9j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 
01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.133164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.234811 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.234906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4b9j\" (UniqueName: \"kubernetes.io/projected/979aea20-7a48-4bff-9188-685c664c6a78-kube-api-access-h4b9j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.234991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.239073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.244401 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.256624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4b9j\" (UniqueName: \"kubernetes.io/projected/979aea20-7a48-4bff-9188-685c664c6a78-kube-api-access-h4b9j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.331828 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:32 crc kubenswrapper[4735]: W0317 01:41:32.854206 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979aea20_7a48_4bff_9188_685c664c6a78.slice/crio-72c0deb92b0ed9d07c2a580920ec7e1aa266ca607019791ca6244c04e611e7fe WatchSource:0}: Error finding container 72c0deb92b0ed9d07c2a580920ec7e1aa266ca607019791ca6244c04e611e7fe: Status 404 returned error can't find the container with id 72c0deb92b0ed9d07c2a580920ec7e1aa266ca607019791ca6244c04e611e7fe Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.855258 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5"] Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.857081 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:41:32 crc kubenswrapper[4735]: I0317 01:41:32.903575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" event={"ID":"979aea20-7a48-4bff-9188-685c664c6a78","Type":"ContainerStarted","Data":"72c0deb92b0ed9d07c2a580920ec7e1aa266ca607019791ca6244c04e611e7fe"} Mar 17 01:41:33 crc kubenswrapper[4735]: I0317 01:41:33.917979 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" event={"ID":"979aea20-7a48-4bff-9188-685c664c6a78","Type":"ContainerStarted","Data":"84d3b6b33e51c5183d52fa9b4dc3c800817ffbb2a6ec2db4a6cef9cb3fb6741c"} Mar 17 01:41:33 crc kubenswrapper[4735]: I0317 01:41:33.952807 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" podStartSLOduration=2.479264549 podStartE2EDuration="2.952782797s" podCreationTimestamp="2026-03-17 
01:41:31 +0000 UTC" firstStartedPulling="2026-03-17 01:41:32.856563443 +0000 UTC m=+1918.488796441" lastFinishedPulling="2026-03-17 01:41:33.330081701 +0000 UTC m=+1918.962314689" observedRunningTime="2026-03-17 01:41:33.940632963 +0000 UTC m=+1919.572866021" watchObservedRunningTime="2026-03-17 01:41:33.952782797 +0000 UTC m=+1919.585015805" Mar 17 01:41:38 crc kubenswrapper[4735]: I0317 01:41:38.989566 4735 generic.go:334] "Generic (PLEG): container finished" podID="979aea20-7a48-4bff-9188-685c664c6a78" containerID="84d3b6b33e51c5183d52fa9b4dc3c800817ffbb2a6ec2db4a6cef9cb3fb6741c" exitCode=0 Mar 17 01:41:38 crc kubenswrapper[4735]: I0317 01:41:38.989651 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" event={"ID":"979aea20-7a48-4bff-9188-685c664c6a78","Type":"ContainerDied","Data":"84d3b6b33e51c5183d52fa9b4dc3c800817ffbb2a6ec2db4a6cef9cb3fb6741c"} Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.393307 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.543325 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4b9j\" (UniqueName: \"kubernetes.io/projected/979aea20-7a48-4bff-9188-685c664c6a78-kube-api-access-h4b9j\") pod \"979aea20-7a48-4bff-9188-685c664c6a78\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.543903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-inventory\") pod \"979aea20-7a48-4bff-9188-685c664c6a78\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.543931 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-ssh-key-openstack-edpm-ipam\") pod \"979aea20-7a48-4bff-9188-685c664c6a78\" (UID: \"979aea20-7a48-4bff-9188-685c664c6a78\") " Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.553100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979aea20-7a48-4bff-9188-685c664c6a78-kube-api-access-h4b9j" (OuterVolumeSpecName: "kube-api-access-h4b9j") pod "979aea20-7a48-4bff-9188-685c664c6a78" (UID: "979aea20-7a48-4bff-9188-685c664c6a78"). InnerVolumeSpecName "kube-api-access-h4b9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.577060 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-inventory" (OuterVolumeSpecName: "inventory") pod "979aea20-7a48-4bff-9188-685c664c6a78" (UID: "979aea20-7a48-4bff-9188-685c664c6a78"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.577624 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "979aea20-7a48-4bff-9188-685c664c6a78" (UID: "979aea20-7a48-4bff-9188-685c664c6a78"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.646339 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.646382 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/979aea20-7a48-4bff-9188-685c664c6a78-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:41:40 crc kubenswrapper[4735]: I0317 01:41:40.646398 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4b9j\" (UniqueName: \"kubernetes.io/projected/979aea20-7a48-4bff-9188-685c664c6a78-kube-api-access-h4b9j\") on node \"crc\" DevicePath \"\"" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.023225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" event={"ID":"979aea20-7a48-4bff-9188-685c664c6a78","Type":"ContainerDied","Data":"72c0deb92b0ed9d07c2a580920ec7e1aa266ca607019791ca6244c04e611e7fe"} Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.023285 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72c0deb92b0ed9d07c2a580920ec7e1aa266ca607019791ca6244c04e611e7fe" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 
01:41:41.023307 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.142071 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b"] Mar 17 01:41:41 crc kubenswrapper[4735]: E0317 01:41:41.142503 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979aea20-7a48-4bff-9188-685c664c6a78" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.142528 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="979aea20-7a48-4bff-9188-685c664c6a78" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.142731 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="979aea20-7a48-4bff-9188-685c664c6a78" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.143412 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.147091 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.147349 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.147503 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.150044 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.167176 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b"] Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.268957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqcpj\" (UniqueName: \"kubernetes.io/projected/cb9f2661-6c58-4a56-8080-2701d2dc456a-kube-api-access-qqcpj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.269019 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.269152 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.370688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.370743 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqcpj\" (UniqueName: \"kubernetes.io/projected/cb9f2661-6c58-4a56-8080-2701d2dc456a-kube-api-access-qqcpj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.370782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.377064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.378726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.385465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqcpj\" (UniqueName: \"kubernetes.io/projected/cb9f2661-6c58-4a56-8080-2701d2dc456a-kube-api-access-qqcpj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pgc7b\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:41 crc kubenswrapper[4735]: I0317 01:41:41.473793 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" Mar 17 01:41:42 crc kubenswrapper[4735]: I0317 01:41:42.076940 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b"] Mar 17 01:41:43 crc kubenswrapper[4735]: I0317 01:41:43.061010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" event={"ID":"cb9f2661-6c58-4a56-8080-2701d2dc456a","Type":"ContainerStarted","Data":"a8a851ad5527aec4688cc89c5b02104f9fe191336ae40e30ad54d24b7aea8c5d"} Mar 17 01:41:43 crc kubenswrapper[4735]: I0317 01:41:43.061301 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" event={"ID":"cb9f2661-6c58-4a56-8080-2701d2dc456a","Type":"ContainerStarted","Data":"a31a269e59847babcebd8eff15794d5c77cfc00c505f80d9920149bea191376a"} Mar 17 01:41:43 crc kubenswrapper[4735]: I0317 01:41:43.086419 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" podStartSLOduration=1.571160063 podStartE2EDuration="2.08640031s" podCreationTimestamp="2026-03-17 01:41:41 +0000 UTC" firstStartedPulling="2026-03-17 01:41:42.091043535 +0000 UTC m=+1927.723276523" lastFinishedPulling="2026-03-17 01:41:42.606283792 +0000 UTC m=+1928.238516770" observedRunningTime="2026-03-17 01:41:43.083455119 +0000 UTC m=+1928.715688177" watchObservedRunningTime="2026-03-17 01:41:43.08640031 +0000 UTC m=+1928.718633298" Mar 17 01:41:54 crc kubenswrapper[4735]: I0317 01:41:54.322840 4735 scope.go:117] "RemoveContainer" containerID="de0b6c23dc85d3427b45c37eaa4f6970377697ff178a86a90f5e81d31ec4fa84" Mar 17 01:41:54 crc kubenswrapper[4735]: I0317 01:41:54.370088 4735 scope.go:117] "RemoveContainer" containerID="26657117c1551f606db97aa91b33829d84fcf2132b80e63543db2cc90903f31f" Mar 17 01:41:54 crc 
kubenswrapper[4735]: I0317 01:41:54.411254 4735 scope.go:117] "RemoveContainer" containerID="1e877041438750a34b04eac5a19fdce50746446ec71044fcedf076ce749be50e"
Mar 17 01:41:54 crc kubenswrapper[4735]: I0317 01:41:54.455890 4735 scope.go:117] "RemoveContainer" containerID="ee78b8cf6c1f046e3c87d5345bc812f9639f13d2a6dcdc7f00d688e9632a96bf"
Mar 17 01:41:54 crc kubenswrapper[4735]: I0317 01:41:54.496950 4735 scope.go:117] "RemoveContainer" containerID="10c9beb590668ce43d1ee2c9b51153dbb13efc6ca2535ace55b3175da605f1a8"
Mar 17 01:41:54 crc kubenswrapper[4735]: I0317 01:41:54.542519 4735 scope.go:117] "RemoveContainer" containerID="72919f521794f74d828599f9cde0ac0b3392679ea7db2e8a5b8e1cd7aae150f7"
Mar 17 01:41:56 crc kubenswrapper[4735]: I0317 01:41:56.049038 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bb4vj"]
Mar 17 01:41:56 crc kubenswrapper[4735]: I0317 01:41:56.062087 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bb4vj"]
Mar 17 01:41:57 crc kubenswrapper[4735]: I0317 01:41:57.090355 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6275d08c-4457-4de4-aa66-98f5666568f5" path="/var/lib/kubelet/pods/6275d08c-4457-4de4-aa66-98f5666568f5/volumes"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.192318 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561862-6n6th"]
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.193619 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.196121 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.196180 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.198401 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.207678 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-6n6th"]
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.304457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vzhj\" (UniqueName: \"kubernetes.io/projected/89fe46ef-0187-4f2e-9afb-b5885cba3dc7-kube-api-access-9vzhj\") pod \"auto-csr-approver-29561862-6n6th\" (UID: \"89fe46ef-0187-4f2e-9afb-b5885cba3dc7\") " pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.406403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vzhj\" (UniqueName: \"kubernetes.io/projected/89fe46ef-0187-4f2e-9afb-b5885cba3dc7-kube-api-access-9vzhj\") pod \"auto-csr-approver-29561862-6n6th\" (UID: \"89fe46ef-0187-4f2e-9afb-b5885cba3dc7\") " pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.424358 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vzhj\" (UniqueName: \"kubernetes.io/projected/89fe46ef-0187-4f2e-9afb-b5885cba3dc7-kube-api-access-9vzhj\") pod \"auto-csr-approver-29561862-6n6th\" (UID: \"89fe46ef-0187-4f2e-9afb-b5885cba3dc7\") " pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.508433 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:00 crc kubenswrapper[4735]: I0317 01:42:00.946171 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-6n6th"]
Mar 17 01:42:00 crc kubenswrapper[4735]: W0317 01:42:00.957137 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89fe46ef_0187_4f2e_9afb_b5885cba3dc7.slice/crio-2c7f8697e1ea11e2f1fa0b6961493c8697531074229be438eb2f711b4c0e2038 WatchSource:0}: Error finding container 2c7f8697e1ea11e2f1fa0b6961493c8697531074229be438eb2f711b4c0e2038: Status 404 returned error can't find the container with id 2c7f8697e1ea11e2f1fa0b6961493c8697531074229be438eb2f711b4c0e2038
Mar 17 01:42:01 crc kubenswrapper[4735]: I0317 01:42:01.303605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561862-6n6th" event={"ID":"89fe46ef-0187-4f2e-9afb-b5885cba3dc7","Type":"ContainerStarted","Data":"2c7f8697e1ea11e2f1fa0b6961493c8697531074229be438eb2f711b4c0e2038"}
Mar 17 01:42:03 crc kubenswrapper[4735]: I0317 01:42:03.327335 4735 generic.go:334] "Generic (PLEG): container finished" podID="89fe46ef-0187-4f2e-9afb-b5885cba3dc7" containerID="89ff04345e4329c4b9c351e94b091d7fee63667d255f8a44356b7822b53548ad" exitCode=0
Mar 17 01:42:03 crc kubenswrapper[4735]: I0317 01:42:03.327438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561862-6n6th" event={"ID":"89fe46ef-0187-4f2e-9afb-b5885cba3dc7","Type":"ContainerDied","Data":"89ff04345e4329c4b9c351e94b091d7fee63667d255f8a44356b7822b53548ad"}
Mar 17 01:42:04 crc kubenswrapper[4735]: I0317 01:42:04.754742 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:04 crc kubenswrapper[4735]: I0317 01:42:04.798644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vzhj\" (UniqueName: \"kubernetes.io/projected/89fe46ef-0187-4f2e-9afb-b5885cba3dc7-kube-api-access-9vzhj\") pod \"89fe46ef-0187-4f2e-9afb-b5885cba3dc7\" (UID: \"89fe46ef-0187-4f2e-9afb-b5885cba3dc7\") "
Mar 17 01:42:04 crc kubenswrapper[4735]: I0317 01:42:04.809035 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fe46ef-0187-4f2e-9afb-b5885cba3dc7-kube-api-access-9vzhj" (OuterVolumeSpecName: "kube-api-access-9vzhj") pod "89fe46ef-0187-4f2e-9afb-b5885cba3dc7" (UID: "89fe46ef-0187-4f2e-9afb-b5885cba3dc7"). InnerVolumeSpecName "kube-api-access-9vzhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:42:04 crc kubenswrapper[4735]: I0317 01:42:04.901707 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vzhj\" (UniqueName: \"kubernetes.io/projected/89fe46ef-0187-4f2e-9afb-b5885cba3dc7-kube-api-access-9vzhj\") on node \"crc\" DevicePath \"\""
Mar 17 01:42:05 crc kubenswrapper[4735]: I0317 01:42:05.352130 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561862-6n6th" event={"ID":"89fe46ef-0187-4f2e-9afb-b5885cba3dc7","Type":"ContainerDied","Data":"2c7f8697e1ea11e2f1fa0b6961493c8697531074229be438eb2f711b4c0e2038"}
Mar 17 01:42:05 crc kubenswrapper[4735]: I0317 01:42:05.352373 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c7f8697e1ea11e2f1fa0b6961493c8697531074229be438eb2f711b4c0e2038"
Mar 17 01:42:05 crc kubenswrapper[4735]: I0317 01:42:05.352347 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-6n6th"
Mar 17 01:42:05 crc kubenswrapper[4735]: I0317 01:42:05.846200 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-jrmwm"]
Mar 17 01:42:05 crc kubenswrapper[4735]: I0317 01:42:05.856678 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-jrmwm"]
Mar 17 01:42:07 crc kubenswrapper[4735]: I0317 01:42:07.109118 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9596f2d-2d29-49d7-aed8-d0554e7ffcec" path="/var/lib/kubelet/pods/e9596f2d-2d29-49d7-aed8-d0554e7ffcec/volumes"
Mar 17 01:42:20 crc kubenswrapper[4735]: I0317 01:42:20.039403 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-q4lh2"]
Mar 17 01:42:20 crc kubenswrapper[4735]: I0317 01:42:20.050255 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-q4lh2"]
Mar 17 01:42:21 crc kubenswrapper[4735]: I0317 01:42:21.084075 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9384926e-1432-4a7d-9a9a-10b05219cc9e" path="/var/lib/kubelet/pods/9384926e-1432-4a7d-9a9a-10b05219cc9e/volumes"
Mar 17 01:42:22 crc kubenswrapper[4735]: I0317 01:42:22.050456 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j9k5n"]
Mar 17 01:42:22 crc kubenswrapper[4735]: I0317 01:42:22.065707 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j9k5n"]
Mar 17 01:42:23 crc kubenswrapper[4735]: I0317 01:42:23.086240 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05381036-1334-4a2b-a3ce-c64331ba0ebb" path="/var/lib/kubelet/pods/05381036-1334-4a2b-a3ce-c64331ba0ebb/volumes"
Mar 17 01:42:24 crc kubenswrapper[4735]: I0317 01:42:24.531641 4735 generic.go:334] "Generic (PLEG): container finished" podID="cb9f2661-6c58-4a56-8080-2701d2dc456a" containerID="a8a851ad5527aec4688cc89c5b02104f9fe191336ae40e30ad54d24b7aea8c5d" exitCode=0
Mar 17 01:42:24 crc kubenswrapper[4735]: I0317 01:42:24.531976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" event={"ID":"cb9f2661-6c58-4a56-8080-2701d2dc456a","Type":"ContainerDied","Data":"a8a851ad5527aec4688cc89c5b02104f9fe191336ae40e30ad54d24b7aea8c5d"}
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.012773 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.070121 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-inventory\") pod \"cb9f2661-6c58-4a56-8080-2701d2dc456a\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") "
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.070313 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqcpj\" (UniqueName: \"kubernetes.io/projected/cb9f2661-6c58-4a56-8080-2701d2dc456a-kube-api-access-qqcpj\") pod \"cb9f2661-6c58-4a56-8080-2701d2dc456a\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") "
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.070420 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-ssh-key-openstack-edpm-ipam\") pod \"cb9f2661-6c58-4a56-8080-2701d2dc456a\" (UID: \"cb9f2661-6c58-4a56-8080-2701d2dc456a\") "
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.076928 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9f2661-6c58-4a56-8080-2701d2dc456a-kube-api-access-qqcpj" (OuterVolumeSpecName: "kube-api-access-qqcpj") pod "cb9f2661-6c58-4a56-8080-2701d2dc456a" (UID: "cb9f2661-6c58-4a56-8080-2701d2dc456a"). InnerVolumeSpecName "kube-api-access-qqcpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.101164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cb9f2661-6c58-4a56-8080-2701d2dc456a" (UID: "cb9f2661-6c58-4a56-8080-2701d2dc456a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.116205 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-inventory" (OuterVolumeSpecName: "inventory") pod "cb9f2661-6c58-4a56-8080-2701d2dc456a" (UID: "cb9f2661-6c58-4a56-8080-2701d2dc456a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.172828 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.172878 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb9f2661-6c58-4a56-8080-2701d2dc456a-inventory\") on node \"crc\" DevicePath \"\""
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.172888 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqcpj\" (UniqueName: \"kubernetes.io/projected/cb9f2661-6c58-4a56-8080-2701d2dc456a-kube-api-access-qqcpj\") on node \"crc\" DevicePath \"\""
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.552267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b" event={"ID":"cb9f2661-6c58-4a56-8080-2701d2dc456a","Type":"ContainerDied","Data":"a31a269e59847babcebd8eff15794d5c77cfc00c505f80d9920149bea191376a"}
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.552623 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31a269e59847babcebd8eff15794d5c77cfc00c505f80d9920149bea191376a"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.552396 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pgc7b"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.631343 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"]
Mar 17 01:42:26 crc kubenswrapper[4735]: E0317 01:42:26.631677 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fe46ef-0187-4f2e-9afb-b5885cba3dc7" containerName="oc"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.631696 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fe46ef-0187-4f2e-9afb-b5885cba3dc7" containerName="oc"
Mar 17 01:42:26 crc kubenswrapper[4735]: E0317 01:42:26.631718 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9f2661-6c58-4a56-8080-2701d2dc456a" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.631725 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9f2661-6c58-4a56-8080-2701d2dc456a" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.631884 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9f2661-6c58-4a56-8080-2701d2dc456a" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.631913 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="89fe46ef-0187-4f2e-9afb-b5885cba3dc7" containerName="oc"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.636975 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.639180 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.641831 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.642059 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.642216 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.650376 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"]
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.695232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbwrn\" (UniqueName: \"kubernetes.io/projected/eb6e1c65-c874-45d5-9c38-203b56f1385c-kube-api-access-gbwrn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.695378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.695418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.796636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.796702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.796815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbwrn\" (UniqueName: \"kubernetes.io/projected/eb6e1c65-c874-45d5-9c38-203b56f1385c-kube-api-access-gbwrn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.801509 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.810279 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.815393 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbwrn\" (UniqueName: \"kubernetes.io/projected/eb6e1c65-c874-45d5-9c38-203b56f1385c-kube-api-access-gbwrn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9jg77\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:26 crc kubenswrapper[4735]: I0317 01:42:26.958425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:42:27 crc kubenswrapper[4735]: I0317 01:42:27.570157 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"]
Mar 17 01:42:28 crc kubenswrapper[4735]: I0317 01:42:28.570757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77" event={"ID":"eb6e1c65-c874-45d5-9c38-203b56f1385c","Type":"ContainerStarted","Data":"11fa1cf0dec5313b2f9402c694068a65746d53b7a5127b0489a49ec5a6ffc23f"}
Mar 17 01:42:28 crc kubenswrapper[4735]: I0317 01:42:28.571424 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77" event={"ID":"eb6e1c65-c874-45d5-9c38-203b56f1385c","Type":"ContainerStarted","Data":"e68e140a7799fc569ffe48faa05e37c2a37d9be6b3ab51b6cab82680b9e3395a"}
Mar 17 01:42:28 crc kubenswrapper[4735]: I0317 01:42:28.593919 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77" podStartSLOduration=2.192603202 podStartE2EDuration="2.593899773s" podCreationTimestamp="2026-03-17 01:42:26 +0000 UTC" firstStartedPulling="2026-03-17 01:42:27.580634236 +0000 UTC m=+1973.212867214" lastFinishedPulling="2026-03-17 01:42:27.981930807 +0000 UTC m=+1973.614163785" observedRunningTime="2026-03-17 01:42:28.589027236 +0000 UTC m=+1974.221260214" watchObservedRunningTime="2026-03-17 01:42:28.593899773 +0000 UTC m=+1974.226132761"
Mar 17 01:42:54 crc kubenswrapper[4735]: I0317 01:42:54.704372 4735 scope.go:117] "RemoveContainer" containerID="cf2442c3c95bbc6e66efc775ac7000bd8eb6691858efefbbbbe2e3ab5a20d394"
Mar 17 01:42:54 crc kubenswrapper[4735]: I0317 01:42:54.752526 4735 scope.go:117] "RemoveContainer" containerID="35beb1b65b3034503734723db98ea8fe9928dfa05c4c44756f9f196abcf7969b"
Mar 17 01:42:54 crc kubenswrapper[4735]: I0317 01:42:54.797026 4735 scope.go:117] "RemoveContainer" containerID="5292ec403c041d4f4968f78f419f67d592c822306f045f6a00310150735ab213"
Mar 17 01:42:54 crc kubenswrapper[4735]: I0317 01:42:54.826816 4735 scope.go:117] "RemoveContainer" containerID="45d87ecc4d8420b78d004cbf9ec049f64406af1fa1707f26bf2eb1c1d1ed3cac"
Mar 17 01:43:05 crc kubenswrapper[4735]: I0317 01:43:05.059393 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rxr77"]
Mar 17 01:43:05 crc kubenswrapper[4735]: I0317 01:43:05.090791 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rxr77"]
Mar 17 01:43:07 crc kubenswrapper[4735]: I0317 01:43:07.088762 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de93197b-d6c2-444b-a532-87f1534094c3" path="/var/lib/kubelet/pods/de93197b-d6c2-444b-a532-87f1534094c3/volumes"
Mar 17 01:43:23 crc kubenswrapper[4735]: I0317 01:43:23.096503 4735 generic.go:334] "Generic (PLEG): container finished" podID="eb6e1c65-c874-45d5-9c38-203b56f1385c" containerID="11fa1cf0dec5313b2f9402c694068a65746d53b7a5127b0489a49ec5a6ffc23f" exitCode=0
Mar 17 01:43:23 crc kubenswrapper[4735]: I0317 01:43:23.096639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77" event={"ID":"eb6e1c65-c874-45d5-9c38-203b56f1385c","Type":"ContainerDied","Data":"11fa1cf0dec5313b2f9402c694068a65746d53b7a5127b0489a49ec5a6ffc23f"}
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.637475 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.697818 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-ssh-key-openstack-edpm-ipam\") pod \"eb6e1c65-c874-45d5-9c38-203b56f1385c\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") "
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.697904 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-inventory\") pod \"eb6e1c65-c874-45d5-9c38-203b56f1385c\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") "
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.697974 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbwrn\" (UniqueName: \"kubernetes.io/projected/eb6e1c65-c874-45d5-9c38-203b56f1385c-kube-api-access-gbwrn\") pod \"eb6e1c65-c874-45d5-9c38-203b56f1385c\" (UID: \"eb6e1c65-c874-45d5-9c38-203b56f1385c\") "
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.707156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6e1c65-c874-45d5-9c38-203b56f1385c-kube-api-access-gbwrn" (OuterVolumeSpecName: "kube-api-access-gbwrn") pod "eb6e1c65-c874-45d5-9c38-203b56f1385c" (UID: "eb6e1c65-c874-45d5-9c38-203b56f1385c"). InnerVolumeSpecName "kube-api-access-gbwrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.727159 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-inventory" (OuterVolumeSpecName: "inventory") pod "eb6e1c65-c874-45d5-9c38-203b56f1385c" (UID: "eb6e1c65-c874-45d5-9c38-203b56f1385c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.744124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eb6e1c65-c874-45d5-9c38-203b56f1385c" (UID: "eb6e1c65-c874-45d5-9c38-203b56f1385c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.800817 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbwrn\" (UniqueName: \"kubernetes.io/projected/eb6e1c65-c874-45d5-9c38-203b56f1385c-kube-api-access-gbwrn\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.801046 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:24 crc kubenswrapper[4735]: I0317 01:43:24.801061 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb6e1c65-c874-45d5-9c38-203b56f1385c-inventory\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.124228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77" event={"ID":"eb6e1c65-c874-45d5-9c38-203b56f1385c","Type":"ContainerDied","Data":"e68e140a7799fc569ffe48faa05e37c2a37d9be6b3ab51b6cab82680b9e3395a"}
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.124289 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e68e140a7799fc569ffe48faa05e37c2a37d9be6b3ab51b6cab82680b9e3395a"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.124292 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9jg77"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.232924 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4j8fl"]
Mar 17 01:43:25 crc kubenswrapper[4735]: E0317 01:43:25.233490 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6e1c65-c874-45d5-9c38-203b56f1385c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.233505 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6e1c65-c874-45d5-9c38-203b56f1385c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.233807 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6e1c65-c874-45d5-9c38-203b56f1385c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.234625 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.240008 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4j8fl"]
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.240976 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.241141 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.241364 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.244784 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.323252 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.323403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.323516 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmg6b\" (UniqueName: \"kubernetes.io/projected/62ab1715-a702-460a-9195-4646d98e2620-kube-api-access-cmg6b\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.426021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.426348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.426465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmg6b\" (UniqueName: \"kubernetes.io/projected/62ab1715-a702-460a-9195-4646d98e2620-kube-api-access-cmg6b\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.430018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.436675 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.450073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmg6b\" (UniqueName: \"kubernetes.io/projected/62ab1715-a702-460a-9195-4646d98e2620-kube-api-access-cmg6b\") pod \"ssh-known-hosts-edpm-deployment-4j8fl\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") " pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:25 crc kubenswrapper[4735]: I0317 01:43:25.552680 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:26 crc kubenswrapper[4735]: I0317 01:43:26.091453 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4j8fl"]
Mar 17 01:43:26 crc kubenswrapper[4735]: I0317 01:43:26.132490 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl" event={"ID":"62ab1715-a702-460a-9195-4646d98e2620","Type":"ContainerStarted","Data":"6a18672afa01726254b9738e7871f062424354f3e956b3f23b33bf27df1d9f48"}
Mar 17 01:43:27 crc kubenswrapper[4735]: I0317 01:43:27.143388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl" event={"ID":"62ab1715-a702-460a-9195-4646d98e2620","Type":"ContainerStarted","Data":"d52828a07e33a2cd72db0a0ff6f92a8e9dcfedfc2d713e7c2404b8474a3862d9"}
Mar 17 01:43:27 crc kubenswrapper[4735]: I0317 01:43:27.168891 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl" podStartSLOduration=1.605997753 podStartE2EDuration="2.168847781s" podCreationTimestamp="2026-03-17 01:43:25 +0000 UTC" firstStartedPulling="2026-03-17 01:43:26.088228676 +0000 UTC m=+2031.720461654" lastFinishedPulling="2026-03-17 01:43:26.651078664 +0000 UTC m=+2032.283311682" observedRunningTime="2026-03-17 01:43:27.163652286 +0000 UTC m=+2032.795885264" watchObservedRunningTime="2026-03-17 01:43:27.168847781 +0000 UTC m=+2032.801080749"
Mar 17 01:43:35 crc kubenswrapper[4735]: I0317 01:43:35.228816 4735 generic.go:334] "Generic (PLEG): container finished" podID="62ab1715-a702-460a-9195-4646d98e2620" containerID="d52828a07e33a2cd72db0a0ff6f92a8e9dcfedfc2d713e7c2404b8474a3862d9" exitCode=0
Mar 17 01:43:35 crc kubenswrapper[4735]: I0317 01:43:35.228841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl" event={"ID":"62ab1715-a702-460a-9195-4646d98e2620","Type":"ContainerDied","Data":"d52828a07e33a2cd72db0a0ff6f92a8e9dcfedfc2d713e7c2404b8474a3862d9"}
Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.651007 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl"
Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.804738 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmg6b\" (UniqueName: \"kubernetes.io/projected/62ab1715-a702-460a-9195-4646d98e2620-kube-api-access-cmg6b\") pod \"62ab1715-a702-460a-9195-4646d98e2620\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") "
Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.804895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-inventory-0\") pod \"62ab1715-a702-460a-9195-4646d98e2620\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") "
Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.804937 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-ssh-key-openstack-edpm-ipam\") pod \"62ab1715-a702-460a-9195-4646d98e2620\" (UID: \"62ab1715-a702-460a-9195-4646d98e2620\") "
Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.817723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ab1715-a702-460a-9195-4646d98e2620-kube-api-access-cmg6b" (OuterVolumeSpecName: "kube-api-access-cmg6b") pod "62ab1715-a702-460a-9195-4646d98e2620" (UID: "62ab1715-a702-460a-9195-4646d98e2620"). InnerVolumeSpecName "kube-api-access-cmg6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.834186 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "62ab1715-a702-460a-9195-4646d98e2620" (UID: "62ab1715-a702-460a-9195-4646d98e2620").
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.858178 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62ab1715-a702-460a-9195-4646d98e2620" (UID: "62ab1715-a702-460a-9195-4646d98e2620"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.908278 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmg6b\" (UniqueName: \"kubernetes.io/projected/62ab1715-a702-460a-9195-4646d98e2620-kube-api-access-cmg6b\") on node \"crc\" DevicePath \"\"" Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.908345 4735 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:43:36 crc kubenswrapper[4735]: I0317 01:43:36.908371 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ab1715-a702-460a-9195-4646d98e2620-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.251730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl" event={"ID":"62ab1715-a702-460a-9195-4646d98e2620","Type":"ContainerDied","Data":"6a18672afa01726254b9738e7871f062424354f3e956b3f23b33bf27df1d9f48"} Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.251790 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a18672afa01726254b9738e7871f062424354f3e956b3f23b33bf27df1d9f48" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.251907 
4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4j8fl" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.389389 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq"] Mar 17 01:43:37 crc kubenswrapper[4735]: E0317 01:43:37.389916 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ab1715-a702-460a-9195-4646d98e2620" containerName="ssh-known-hosts-edpm-deployment" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.389930 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ab1715-a702-460a-9195-4646d98e2620" containerName="ssh-known-hosts-edpm-deployment" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.390181 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ab1715-a702-460a-9195-4646d98e2620" containerName="ssh-known-hosts-edpm-deployment" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.390964 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.397526 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.398238 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.398408 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.398588 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.400660 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq"] Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.540149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.541505 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.541589 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48btm\" (UniqueName: \"kubernetes.io/projected/6136d26b-0db1-42a3-80df-aac1dd6daf50-kube-api-access-48btm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.644134 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.644259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.644282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48btm\" (UniqueName: \"kubernetes.io/projected/6136d26b-0db1-42a3-80df-aac1dd6daf50-kube-api-access-48btm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.651680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: 
\"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.657404 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.660976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48btm\" (UniqueName: \"kubernetes.io/projected/6136d26b-0db1-42a3-80df-aac1dd6daf50-kube-api-access-48btm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p8mnq\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:37 crc kubenswrapper[4735]: I0317 01:43:37.726552 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:38 crc kubenswrapper[4735]: I0317 01:43:38.284225 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq"] Mar 17 01:43:39 crc kubenswrapper[4735]: I0317 01:43:39.283179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" event={"ID":"6136d26b-0db1-42a3-80df-aac1dd6daf50","Type":"ContainerStarted","Data":"01a66494812488e3bd1d36aa4a39d20a4e0fdb27907837c11b6312f7c6f96562"} Mar 17 01:43:39 crc kubenswrapper[4735]: I0317 01:43:39.283661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" event={"ID":"6136d26b-0db1-42a3-80df-aac1dd6daf50","Type":"ContainerStarted","Data":"0867b710319bf16da41fafd4f77e1e8031fbd69d6a7a5d09e3792a94be8e02bb"} Mar 17 01:43:39 crc kubenswrapper[4735]: I0317 01:43:39.307930 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" podStartSLOduration=1.879119303 podStartE2EDuration="2.30791293s" podCreationTimestamp="2026-03-17 01:43:37 +0000 UTC" firstStartedPulling="2026-03-17 01:43:38.291967658 +0000 UTC m=+2043.924200646" lastFinishedPulling="2026-03-17 01:43:38.720761255 +0000 UTC m=+2044.352994273" observedRunningTime="2026-03-17 01:43:39.301937426 +0000 UTC m=+2044.934170434" watchObservedRunningTime="2026-03-17 01:43:39.30791293 +0000 UTC m=+2044.940145908" Mar 17 01:43:42 crc kubenswrapper[4735]: I0317 01:43:42.607180 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:43:42 crc kubenswrapper[4735]: I0317 
01:43:42.607471 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:43:48 crc kubenswrapper[4735]: I0317 01:43:48.386965 4735 generic.go:334] "Generic (PLEG): container finished" podID="6136d26b-0db1-42a3-80df-aac1dd6daf50" containerID="01a66494812488e3bd1d36aa4a39d20a4e0fdb27907837c11b6312f7c6f96562" exitCode=0 Mar 17 01:43:48 crc kubenswrapper[4735]: I0317 01:43:48.387068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" event={"ID":"6136d26b-0db1-42a3-80df-aac1dd6daf50","Type":"ContainerDied","Data":"01a66494812488e3bd1d36aa4a39d20a4e0fdb27907837c11b6312f7c6f96562"} Mar 17 01:43:49 crc kubenswrapper[4735]: I0317 01:43:49.920586 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.116659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48btm\" (UniqueName: \"kubernetes.io/projected/6136d26b-0db1-42a3-80df-aac1dd6daf50-kube-api-access-48btm\") pod \"6136d26b-0db1-42a3-80df-aac1dd6daf50\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.116827 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-inventory\") pod \"6136d26b-0db1-42a3-80df-aac1dd6daf50\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.116904 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-ssh-key-openstack-edpm-ipam\") pod \"6136d26b-0db1-42a3-80df-aac1dd6daf50\" (UID: \"6136d26b-0db1-42a3-80df-aac1dd6daf50\") " Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.124148 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6136d26b-0db1-42a3-80df-aac1dd6daf50-kube-api-access-48btm" (OuterVolumeSpecName: "kube-api-access-48btm") pod "6136d26b-0db1-42a3-80df-aac1dd6daf50" (UID: "6136d26b-0db1-42a3-80df-aac1dd6daf50"). InnerVolumeSpecName "kube-api-access-48btm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.154976 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-inventory" (OuterVolumeSpecName: "inventory") pod "6136d26b-0db1-42a3-80df-aac1dd6daf50" (UID: "6136d26b-0db1-42a3-80df-aac1dd6daf50"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.172240 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6136d26b-0db1-42a3-80df-aac1dd6daf50" (UID: "6136d26b-0db1-42a3-80df-aac1dd6daf50"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.219164 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48btm\" (UniqueName: \"kubernetes.io/projected/6136d26b-0db1-42a3-80df-aac1dd6daf50-kube-api-access-48btm\") on node \"crc\" DevicePath \"\"" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.219225 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.219238 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6136d26b-0db1-42a3-80df-aac1dd6daf50-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.412031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" event={"ID":"6136d26b-0db1-42a3-80df-aac1dd6daf50","Type":"ContainerDied","Data":"0867b710319bf16da41fafd4f77e1e8031fbd69d6a7a5d09e3792a94be8e02bb"} Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.412077 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0867b710319bf16da41fafd4f77e1e8031fbd69d6a7a5d09e3792a94be8e02bb" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 
01:43:50.412126 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p8mnq" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.504723 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9"] Mar 17 01:43:50 crc kubenswrapper[4735]: E0317 01:43:50.505430 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6136d26b-0db1-42a3-80df-aac1dd6daf50" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.505449 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6136d26b-0db1-42a3-80df-aac1dd6daf50" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.505661 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6136d26b-0db1-42a3-80df-aac1dd6daf50" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.506414 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.509170 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.509428 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.509886 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.510380 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.516839 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9"] Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.525903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.525997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.526086 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdsm\" (UniqueName: \"kubernetes.io/projected/2a7a18af-48a3-47f8-8318-e6070738d82f-kube-api-access-dxdsm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.628572 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.628645 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.628710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdsm\" (UniqueName: \"kubernetes.io/projected/2a7a18af-48a3-47f8-8318-e6070738d82f-kube-api-access-dxdsm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.641938 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.643993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.649816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdsm\" (UniqueName: \"kubernetes.io/projected/2a7a18af-48a3-47f8-8318-e6070738d82f-kube-api-access-dxdsm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:50 crc kubenswrapper[4735]: I0317 01:43:50.825523 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:43:51 crc kubenswrapper[4735]: I0317 01:43:51.473584 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9"] Mar 17 01:43:51 crc kubenswrapper[4735]: W0317 01:43:51.481496 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7a18af_48a3_47f8_8318_e6070738d82f.slice/crio-ce99dc45e052c49c11a3ae1af646c34d4f08bf178a4bb0f2f05a620b8532433a WatchSource:0}: Error finding container ce99dc45e052c49c11a3ae1af646c34d4f08bf178a4bb0f2f05a620b8532433a: Status 404 returned error can't find the container with id ce99dc45e052c49c11a3ae1af646c34d4f08bf178a4bb0f2f05a620b8532433a Mar 17 01:43:52 crc kubenswrapper[4735]: I0317 01:43:52.435525 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" event={"ID":"2a7a18af-48a3-47f8-8318-e6070738d82f","Type":"ContainerStarted","Data":"94afedb24a6b7120525a108922f466e711d4b3286f9c66b7f917debe6b32ac20"} Mar 17 01:43:52 crc kubenswrapper[4735]: I0317 01:43:52.435833 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" event={"ID":"2a7a18af-48a3-47f8-8318-e6070738d82f","Type":"ContainerStarted","Data":"ce99dc45e052c49c11a3ae1af646c34d4f08bf178a4bb0f2f05a620b8532433a"} Mar 17 01:43:52 crc kubenswrapper[4735]: I0317 01:43:52.463118 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" podStartSLOduration=2.028507527 podStartE2EDuration="2.463093065s" podCreationTimestamp="2026-03-17 01:43:50 +0000 UTC" firstStartedPulling="2026-03-17 01:43:51.483631764 +0000 UTC m=+2057.115864742" lastFinishedPulling="2026-03-17 01:43:51.918217262 +0000 UTC m=+2057.550450280" 
observedRunningTime="2026-03-17 01:43:52.454391285 +0000 UTC m=+2058.086624263" watchObservedRunningTime="2026-03-17 01:43:52.463093065 +0000 UTC m=+2058.095326083" Mar 17 01:43:54 crc kubenswrapper[4735]: I0317 01:43:54.968972 4735 scope.go:117] "RemoveContainer" containerID="53a61edfce07e1916ab0378827250762dd542547d73f9c3573823b20be6aa4d6" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.151351 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561864-cjrqv"] Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.153795 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.156146 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.157554 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.158713 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.166528 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-cjrqv"] Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.236198 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjgv\" (UniqueName: \"kubernetes.io/projected/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8-kube-api-access-vfjgv\") pod \"auto-csr-approver-29561864-cjrqv\" (UID: \"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8\") " pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.338154 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vfjgv\" (UniqueName: \"kubernetes.io/projected/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8-kube-api-access-vfjgv\") pod \"auto-csr-approver-29561864-cjrqv\" (UID: \"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8\") " pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.366365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjgv\" (UniqueName: \"kubernetes.io/projected/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8-kube-api-access-vfjgv\") pod \"auto-csr-approver-29561864-cjrqv\" (UID: \"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8\") " pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:00 crc kubenswrapper[4735]: I0317 01:44:00.483472 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:01 crc kubenswrapper[4735]: I0317 01:44:01.004074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-cjrqv"] Mar 17 01:44:01 crc kubenswrapper[4735]: I0317 01:44:01.538202 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" event={"ID":"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8","Type":"ContainerStarted","Data":"b11ea2de26f4189460d3f9c041ea8eb11fc40c96ab5d25839242d3c32c708227"} Mar 17 01:44:02 crc kubenswrapper[4735]: I0317 01:44:02.548065 4735 generic.go:334] "Generic (PLEG): container finished" podID="bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8" containerID="4ce4033d72149f893dbe7947805d5f254b63d96c0b5c328b882a97b771257c82" exitCode=0 Mar 17 01:44:02 crc kubenswrapper[4735]: I0317 01:44:02.548121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" event={"ID":"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8","Type":"ContainerDied","Data":"4ce4033d72149f893dbe7947805d5f254b63d96c0b5c328b882a97b771257c82"} Mar 17 01:44:02 crc kubenswrapper[4735]: I0317 
01:44:02.550763 4735 generic.go:334] "Generic (PLEG): container finished" podID="2a7a18af-48a3-47f8-8318-e6070738d82f" containerID="94afedb24a6b7120525a108922f466e711d4b3286f9c66b7f917debe6b32ac20" exitCode=0 Mar 17 01:44:02 crc kubenswrapper[4735]: I0317 01:44:02.550809 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" event={"ID":"2a7a18af-48a3-47f8-8318-e6070738d82f","Type":"ContainerDied","Data":"94afedb24a6b7120525a108922f466e711d4b3286f9c66b7f917debe6b32ac20"} Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.002198 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.059360 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfjgv\" (UniqueName: \"kubernetes.io/projected/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8-kube-api-access-vfjgv\") pod \"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8\" (UID: \"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8\") " Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.076029 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8-kube-api-access-vfjgv" (OuterVolumeSpecName: "kube-api-access-vfjgv") pod "bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8" (UID: "bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8"). InnerVolumeSpecName "kube-api-access-vfjgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.162450 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfjgv\" (UniqueName: \"kubernetes.io/projected/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8-kube-api-access-vfjgv\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.166671 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.264046 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-inventory\") pod \"2a7a18af-48a3-47f8-8318-e6070738d82f\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.264154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-ssh-key-openstack-edpm-ipam\") pod \"2a7a18af-48a3-47f8-8318-e6070738d82f\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.264215 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdsm\" (UniqueName: \"kubernetes.io/projected/2a7a18af-48a3-47f8-8318-e6070738d82f-kube-api-access-dxdsm\") pod \"2a7a18af-48a3-47f8-8318-e6070738d82f\" (UID: \"2a7a18af-48a3-47f8-8318-e6070738d82f\") " Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.268440 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7a18af-48a3-47f8-8318-e6070738d82f-kube-api-access-dxdsm" (OuterVolumeSpecName: "kube-api-access-dxdsm") pod "2a7a18af-48a3-47f8-8318-e6070738d82f" (UID: "2a7a18af-48a3-47f8-8318-e6070738d82f"). InnerVolumeSpecName "kube-api-access-dxdsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.289600 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-inventory" (OuterVolumeSpecName: "inventory") pod "2a7a18af-48a3-47f8-8318-e6070738d82f" (UID: "2a7a18af-48a3-47f8-8318-e6070738d82f"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.290214 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a7a18af-48a3-47f8-8318-e6070738d82f" (UID: "2a7a18af-48a3-47f8-8318-e6070738d82f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.367007 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.367036 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7a18af-48a3-47f8-8318-e6070738d82f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.367047 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdsm\" (UniqueName: \"kubernetes.io/projected/2a7a18af-48a3-47f8-8318-e6070738d82f-kube-api-access-dxdsm\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.575352 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" event={"ID":"2a7a18af-48a3-47f8-8318-e6070738d82f","Type":"ContainerDied","Data":"ce99dc45e052c49c11a3ae1af646c34d4f08bf178a4bb0f2f05a620b8532433a"} Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.575396 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce99dc45e052c49c11a3ae1af646c34d4f08bf178a4bb0f2f05a620b8532433a" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 
01:44:04.575509 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.576999 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" event={"ID":"bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8","Type":"ContainerDied","Data":"b11ea2de26f4189460d3f9c041ea8eb11fc40c96ab5d25839242d3c32c708227"} Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.577029 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11ea2de26f4189460d3f9c041ea8eb11fc40c96ab5d25839242d3c32c708227" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.577084 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-cjrqv" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.692687 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk"] Mar 17 01:44:04 crc kubenswrapper[4735]: E0317 01:44:04.693363 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7a18af-48a3-47f8-8318-e6070738d82f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.693383 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7a18af-48a3-47f8-8318-e6070738d82f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:44:04 crc kubenswrapper[4735]: E0317 01:44:04.693420 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8" containerName="oc" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.693426 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8" containerName="oc" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.693604 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8" containerName="oc" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.693614 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7a18af-48a3-47f8-8318-e6070738d82f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.694262 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.700190 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.701428 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.701804 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.702143 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.702253 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.702756 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.703176 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.708660 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.716323 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk"] Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.776722 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.776777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.776827 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.776882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.776924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.776977 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777003 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777042 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rps26\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-kube-api-access-rps26\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777176 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777199 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.777237 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.878947 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.878995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879030 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879126 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879192 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879218 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rps26\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-kube-api-access-rps26\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879416 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.879450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.885134 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.886614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.887170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.887345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 
01:44:04.887959 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.888099 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.888420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.888536 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.888534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.889686 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.890412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.890463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.898710 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:04 crc kubenswrapper[4735]: I0317 01:44:04.901298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rps26\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-kube-api-access-rps26\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:05 crc kubenswrapper[4735]: I0317 01:44:05.012995 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:05 crc kubenswrapper[4735]: I0317 01:44:05.086935 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-hrpqm"] Mar 17 01:44:05 crc kubenswrapper[4735]: I0317 01:44:05.087263 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-hrpqm"] Mar 17 01:44:05 crc kubenswrapper[4735]: I0317 01:44:05.552453 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk"] Mar 17 01:44:05 crc kubenswrapper[4735]: I0317 01:44:05.586056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" event={"ID":"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65","Type":"ContainerStarted","Data":"6fdfcc489554cd5d643efa99bb45dc43e37416f73c75972026ac9f01c2807494"} Mar 17 01:44:06 crc kubenswrapper[4735]: I0317 01:44:06.597087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" 
event={"ID":"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65","Type":"ContainerStarted","Data":"514ebf9f3ce50930efe91fe14a49769089e55e3f98fac357ccadd8bdf6dbef29"} Mar 17 01:44:06 crc kubenswrapper[4735]: I0317 01:44:06.623228 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" podStartSLOduration=2.168423923 podStartE2EDuration="2.623210307s" podCreationTimestamp="2026-03-17 01:44:04 +0000 UTC" firstStartedPulling="2026-03-17 01:44:05.572064604 +0000 UTC m=+2071.204297582" lastFinishedPulling="2026-03-17 01:44:06.026850948 +0000 UTC m=+2071.659083966" observedRunningTime="2026-03-17 01:44:06.623209797 +0000 UTC m=+2072.255442775" watchObservedRunningTime="2026-03-17 01:44:06.623210307 +0000 UTC m=+2072.255443275" Mar 17 01:44:07 crc kubenswrapper[4735]: I0317 01:44:07.092909 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cb487f-a9aa-4d54-8da1-7b45eae1cff2" path="/var/lib/kubelet/pods/a4cb487f-a9aa-4d54-8da1-7b45eae1cff2/volumes" Mar 17 01:44:12 crc kubenswrapper[4735]: I0317 01:44:12.606914 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:44:12 crc kubenswrapper[4735]: I0317 01:44:12.607657 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.612832 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.613837 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.614314 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.615944 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"599b0f1aa110d3294f873b6200db87c7a4f4c9bfd5e3739ccd5f9a116ac7a76c"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.616077 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://599b0f1aa110d3294f873b6200db87c7a4f4c9bfd5e3739ccd5f9a116ac7a76c" gracePeriod=600 Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.962769 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="599b0f1aa110d3294f873b6200db87c7a4f4c9bfd5e3739ccd5f9a116ac7a76c" exitCode=0 Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.962909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"599b0f1aa110d3294f873b6200db87c7a4f4c9bfd5e3739ccd5f9a116ac7a76c"} Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.963106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb"} Mar 17 01:44:42 crc kubenswrapper[4735]: I0317 01:44:42.963129 4735 scope.go:117] "RemoveContainer" containerID="ca37481812a445789f25756bcc8ffae90e1024edc3b2b85ffee7894f4fe7c5ba" Mar 17 01:44:45 crc kubenswrapper[4735]: I0317 01:44:45.994110 4735 generic.go:334] "Generic (PLEG): container finished" podID="9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" containerID="514ebf9f3ce50930efe91fe14a49769089e55e3f98fac357ccadd8bdf6dbef29" exitCode=0 Mar 17 01:44:45 crc kubenswrapper[4735]: I0317 01:44:45.994229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" event={"ID":"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65","Type":"ContainerDied","Data":"514ebf9f3ce50930efe91fe14a49769089e55e3f98fac357ccadd8bdf6dbef29"} Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.462664 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.601723 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-inventory\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.601801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.601835 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.601962 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-bootstrap-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602015 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rps26\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-kube-api-access-rps26\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: 
\"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602089 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-libvirt-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602149 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-telemetry-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602213 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ssh-key-openstack-edpm-ipam\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602286 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: 
\"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-neutron-metadata-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-repo-setup-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-nova-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.602408 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ovn-combined-ca-bundle\") pod \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\" (UID: \"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65\") " Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.610733 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.612341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.612514 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.612751 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.614523 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.615280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.615676 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.616984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.618054 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.620086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.624164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-kube-api-access-rps26" (OuterVolumeSpecName: "kube-api-access-rps26") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "kube-api-access-rps26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.626225 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.640146 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.650513 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-inventory" (OuterVolumeSpecName: "inventory") pod "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" (UID: "9853dc02-da1f-43e7-ab8a-aa9aeae6ff65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705034 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705066 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705076 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705089 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705100 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc 
kubenswrapper[4735]: I0317 01:44:47.705110 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rps26\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-kube-api-access-rps26\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705119 4735 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705127 4735 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705135 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705144 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705153 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705165 4735 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705174 4735 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:47 crc kubenswrapper[4735]: I0317 01:44:47.705185 4735 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9853dc02-da1f-43e7-ab8a-aa9aeae6ff65-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.013677 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" event={"ID":"9853dc02-da1f-43e7-ab8a-aa9aeae6ff65","Type":"ContainerDied","Data":"6fdfcc489554cd5d643efa99bb45dc43e37416f73c75972026ac9f01c2807494"} Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.013743 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fdfcc489554cd5d643efa99bb45dc43e37416f73c75972026ac9f01c2807494" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.013761 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.241921 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn"] Mar 17 01:44:48 crc kubenswrapper[4735]: E0317 01:44:48.242494 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.242561 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.242823 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9853dc02-da1f-43e7-ab8a-aa9aeae6ff65" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.243466 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.250826 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.251028 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.251136 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.251644 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.251811 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.268791 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn"] Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.419687 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2gt\" (UniqueName: \"kubernetes.io/projected/039a6699-7fc5-48c1-89b0-0e3946f69349-kube-api-access-sn2gt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.419725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.419747 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.419770 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/039a6699-7fc5-48c1-89b0-0e3946f69349-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.419826 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.522198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.522449 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.522493 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2gt\" (UniqueName: \"kubernetes.io/projected/039a6699-7fc5-48c1-89b0-0e3946f69349-kube-api-access-sn2gt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.522535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.522584 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/039a6699-7fc5-48c1-89b0-0e3946f69349-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.524736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/039a6699-7fc5-48c1-89b0-0e3946f69349-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.527267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.535466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.539479 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2gt\" (UniqueName: \"kubernetes.io/projected/039a6699-7fc5-48c1-89b0-0e3946f69349-kube-api-access-sn2gt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.539836 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pwsgn\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:48 crc kubenswrapper[4735]: I0317 01:44:48.585398 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:44:49 crc kubenswrapper[4735]: I0317 01:44:49.006743 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn"] Mar 17 01:44:50 crc kubenswrapper[4735]: I0317 01:44:50.031919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" event={"ID":"039a6699-7fc5-48c1-89b0-0e3946f69349","Type":"ContainerStarted","Data":"b3b1c80e20e62fcb2d62391aa2e590baaa695267539f3798ad10ffccdaadff29"} Mar 17 01:44:50 crc kubenswrapper[4735]: I0317 01:44:50.032182 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" event={"ID":"039a6699-7fc5-48c1-89b0-0e3946f69349","Type":"ContainerStarted","Data":"0786442e08490fa2f40b60cc51a13e07d1797d92887e2693d6ce7f59eb51ed10"} Mar 17 01:44:50 crc kubenswrapper[4735]: I0317 01:44:50.054824 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" podStartSLOduration=1.374778171 podStartE2EDuration="2.054807172s" podCreationTimestamp="2026-03-17 01:44:48 +0000 UTC" firstStartedPulling="2026-03-17 01:44:49.025460725 +0000 UTC m=+2114.657693703" lastFinishedPulling="2026-03-17 01:44:49.705489716 +0000 UTC m=+2115.337722704" observedRunningTime="2026-03-17 01:44:50.045967057 +0000 UTC m=+2115.678200035" watchObservedRunningTime="2026-03-17 01:44:50.054807172 +0000 UTC m=+2115.687040150" Mar 17 01:44:55 crc kubenswrapper[4735]: I0317 01:44:55.037012 4735 scope.go:117] "RemoveContainer" containerID="c3ed463722cf029fd37850d25d47594dc9c642df9a6a56a344cd8213261c2549" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.164195 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb"] Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 
01:45:00.165775 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.169249 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.169569 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.182261 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb"] Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.282050 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7psv4\" (UniqueName: \"kubernetes.io/projected/1aa06a89-796f-4cea-b11b-c75a723deda6-kube-api-access-7psv4\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.282187 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa06a89-796f-4cea-b11b-c75a723deda6-config-volume\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.282232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa06a89-796f-4cea-b11b-c75a723deda6-secret-volume\") pod \"collect-profiles-29561865-dl9qb\" (UID: 
\"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.384205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7psv4\" (UniqueName: \"kubernetes.io/projected/1aa06a89-796f-4cea-b11b-c75a723deda6-kube-api-access-7psv4\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.384841 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa06a89-796f-4cea-b11b-c75a723deda6-config-volume\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.384988 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa06a89-796f-4cea-b11b-c75a723deda6-secret-volume\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.385709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa06a89-796f-4cea-b11b-c75a723deda6-config-volume\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.394546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1aa06a89-796f-4cea-b11b-c75a723deda6-secret-volume\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.407703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7psv4\" (UniqueName: \"kubernetes.io/projected/1aa06a89-796f-4cea-b11b-c75a723deda6-kube-api-access-7psv4\") pod \"collect-profiles-29561865-dl9qb\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.493767 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:00 crc kubenswrapper[4735]: I0317 01:45:00.931302 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb"] Mar 17 01:45:01 crc kubenswrapper[4735]: I0317 01:45:01.146430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" event={"ID":"1aa06a89-796f-4cea-b11b-c75a723deda6","Type":"ContainerStarted","Data":"c3fabb75cbc491ae3d59da407b364de2729f7682a16d7f7abffa5605b1879922"} Mar 17 01:45:02 crc kubenswrapper[4735]: I0317 01:45:02.157714 4735 generic.go:334] "Generic (PLEG): container finished" podID="1aa06a89-796f-4cea-b11b-c75a723deda6" containerID="e0132bf9d26399a9b096e4fcb665d5ec0c6d47b9a3cc0683301ef945b2af8b4e" exitCode=0 Mar 17 01:45:02 crc kubenswrapper[4735]: I0317 01:45:02.157765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" 
event={"ID":"1aa06a89-796f-4cea-b11b-c75a723deda6","Type":"ContainerDied","Data":"e0132bf9d26399a9b096e4fcb665d5ec0c6d47b9a3cc0683301ef945b2af8b4e"} Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.549492 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.649131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa06a89-796f-4cea-b11b-c75a723deda6-secret-volume\") pod \"1aa06a89-796f-4cea-b11b-c75a723deda6\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.649467 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7psv4\" (UniqueName: \"kubernetes.io/projected/1aa06a89-796f-4cea-b11b-c75a723deda6-kube-api-access-7psv4\") pod \"1aa06a89-796f-4cea-b11b-c75a723deda6\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.649655 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa06a89-796f-4cea-b11b-c75a723deda6-config-volume\") pod \"1aa06a89-796f-4cea-b11b-c75a723deda6\" (UID: \"1aa06a89-796f-4cea-b11b-c75a723deda6\") " Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.650851 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa06a89-796f-4cea-b11b-c75a723deda6-config-volume" (OuterVolumeSpecName: "config-volume") pod "1aa06a89-796f-4cea-b11b-c75a723deda6" (UID: "1aa06a89-796f-4cea-b11b-c75a723deda6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.655032 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa06a89-796f-4cea-b11b-c75a723deda6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1aa06a89-796f-4cea-b11b-c75a723deda6" (UID: "1aa06a89-796f-4cea-b11b-c75a723deda6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.657164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa06a89-796f-4cea-b11b-c75a723deda6-kube-api-access-7psv4" (OuterVolumeSpecName: "kube-api-access-7psv4") pod "1aa06a89-796f-4cea-b11b-c75a723deda6" (UID: "1aa06a89-796f-4cea-b11b-c75a723deda6"). InnerVolumeSpecName "kube-api-access-7psv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.752140 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa06a89-796f-4cea-b11b-c75a723deda6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.752180 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7psv4\" (UniqueName: \"kubernetes.io/projected/1aa06a89-796f-4cea-b11b-c75a723deda6-kube-api-access-7psv4\") on node \"crc\" DevicePath \"\"" Mar 17 01:45:03 crc kubenswrapper[4735]: I0317 01:45:03.752195 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa06a89-796f-4cea-b11b-c75a723deda6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:45:04 crc kubenswrapper[4735]: I0317 01:45:04.194417 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" 
event={"ID":"1aa06a89-796f-4cea-b11b-c75a723deda6","Type":"ContainerDied","Data":"c3fabb75cbc491ae3d59da407b364de2729f7682a16d7f7abffa5605b1879922"} Mar 17 01:45:04 crc kubenswrapper[4735]: I0317 01:45:04.194454 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb" Mar 17 01:45:04 crc kubenswrapper[4735]: I0317 01:45:04.194477 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3fabb75cbc491ae3d59da407b364de2729f7682a16d7f7abffa5605b1879922" Mar 17 01:45:04 crc kubenswrapper[4735]: I0317 01:45:04.616875 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78"] Mar 17 01:45:04 crc kubenswrapper[4735]: I0317 01:45:04.626053 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-2vw78"] Mar 17 01:45:05 crc kubenswrapper[4735]: I0317 01:45:05.086982 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fe26de-7115-40ae-a335-8feafa74fb68" path="/var/lib/kubelet/pods/a3fe26de-7115-40ae-a335-8feafa74fb68/volumes" Mar 17 01:45:55 crc kubenswrapper[4735]: I0317 01:45:55.125193 4735 scope.go:117] "RemoveContainer" containerID="aa799d9c8cf4e6466b4829b66470e10bc4def1e5479adcf9f4abc707b6f2c647" Mar 17 01:45:59 crc kubenswrapper[4735]: I0317 01:45:59.698589 4735 generic.go:334] "Generic (PLEG): container finished" podID="039a6699-7fc5-48c1-89b0-0e3946f69349" containerID="b3b1c80e20e62fcb2d62391aa2e590baaa695267539f3798ad10ffccdaadff29" exitCode=0 Mar 17 01:45:59 crc kubenswrapper[4735]: I0317 01:45:59.698675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" event={"ID":"039a6699-7fc5-48c1-89b0-0e3946f69349","Type":"ContainerDied","Data":"b3b1c80e20e62fcb2d62391aa2e590baaa695267539f3798ad10ffccdaadff29"} Mar 17 
01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.143173 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561866-85mpv"] Mar 17 01:46:00 crc kubenswrapper[4735]: E0317 01:46:00.143757 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa06a89-796f-4cea-b11b-c75a723deda6" containerName="collect-profiles" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.143772 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa06a89-796f-4cea-b11b-c75a723deda6" containerName="collect-profiles" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.143978 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa06a89-796f-4cea-b11b-c75a723deda6" containerName="collect-profiles" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.144556 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.150814 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.152061 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.152263 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-85mpv"] Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.155073 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.300506 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfq4x\" (UniqueName: \"kubernetes.io/projected/697cb47c-75db-4d5e-a348-23fd7b455b06-kube-api-access-rfq4x\") pod \"auto-csr-approver-29561866-85mpv\" 
(UID: \"697cb47c-75db-4d5e-a348-23fd7b455b06\") " pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.403087 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfq4x\" (UniqueName: \"kubernetes.io/projected/697cb47c-75db-4d5e-a348-23fd7b455b06-kube-api-access-rfq4x\") pod \"auto-csr-approver-29561866-85mpv\" (UID: \"697cb47c-75db-4d5e-a348-23fd7b455b06\") " pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.428590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfq4x\" (UniqueName: \"kubernetes.io/projected/697cb47c-75db-4d5e-a348-23fd7b455b06-kube-api-access-rfq4x\") pod \"auto-csr-approver-29561866-85mpv\" (UID: \"697cb47c-75db-4d5e-a348-23fd7b455b06\") " pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.459338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:00 crc kubenswrapper[4735]: I0317 01:46:00.970921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-85mpv"] Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.146944 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.318831 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-inventory\") pod \"039a6699-7fc5-48c1-89b0-0e3946f69349\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.318988 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ovn-combined-ca-bundle\") pod \"039a6699-7fc5-48c1-89b0-0e3946f69349\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.319079 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2gt\" (UniqueName: \"kubernetes.io/projected/039a6699-7fc5-48c1-89b0-0e3946f69349-kube-api-access-sn2gt\") pod \"039a6699-7fc5-48c1-89b0-0e3946f69349\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.319201 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ssh-key-openstack-edpm-ipam\") pod \"039a6699-7fc5-48c1-89b0-0e3946f69349\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.319238 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/039a6699-7fc5-48c1-89b0-0e3946f69349-ovncontroller-config-0\") pod \"039a6699-7fc5-48c1-89b0-0e3946f69349\" (UID: \"039a6699-7fc5-48c1-89b0-0e3946f69349\") " Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.325080 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "039a6699-7fc5-48c1-89b0-0e3946f69349" (UID: "039a6699-7fc5-48c1-89b0-0e3946f69349"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.325182 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039a6699-7fc5-48c1-89b0-0e3946f69349-kube-api-access-sn2gt" (OuterVolumeSpecName: "kube-api-access-sn2gt") pod "039a6699-7fc5-48c1-89b0-0e3946f69349" (UID: "039a6699-7fc5-48c1-89b0-0e3946f69349"). InnerVolumeSpecName "kube-api-access-sn2gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.348229 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-inventory" (OuterVolumeSpecName: "inventory") pod "039a6699-7fc5-48c1-89b0-0e3946f69349" (UID: "039a6699-7fc5-48c1-89b0-0e3946f69349"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.348552 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039a6699-7fc5-48c1-89b0-0e3946f69349-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "039a6699-7fc5-48c1-89b0-0e3946f69349" (UID: "039a6699-7fc5-48c1-89b0-0e3946f69349"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.352547 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "039a6699-7fc5-48c1-89b0-0e3946f69349" (UID: "039a6699-7fc5-48c1-89b0-0e3946f69349"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.423134 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.423348 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.423407 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2gt\" (UniqueName: \"kubernetes.io/projected/039a6699-7fc5-48c1-89b0-0e3946f69349-kube-api-access-sn2gt\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.423461 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/039a6699-7fc5-48c1-89b0-0e3946f69349-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.423629 4735 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/039a6699-7fc5-48c1-89b0-0e3946f69349-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.722202 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-85mpv" event={"ID":"697cb47c-75db-4d5e-a348-23fd7b455b06","Type":"ContainerStarted","Data":"f0e685250378f767b82c75ca5e6046d36145a2413e0a32c0d8b142755e9fd17a"} Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.725075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" event={"ID":"039a6699-7fc5-48c1-89b0-0e3946f69349","Type":"ContainerDied","Data":"0786442e08490fa2f40b60cc51a13e07d1797d92887e2693d6ce7f59eb51ed10"} Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.725105 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0786442e08490fa2f40b60cc51a13e07d1797d92887e2693d6ce7f59eb51ed10" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.725163 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pwsgn" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.862778 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r"] Mar 17 01:46:01 crc kubenswrapper[4735]: E0317 01:46:01.863552 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039a6699-7fc5-48c1-89b0-0e3946f69349" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.863573 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="039a6699-7fc5-48c1-89b0-0e3946f69349" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.863801 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="039a6699-7fc5-48c1-89b0-0e3946f69349" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.864558 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.867104 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.867157 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.869238 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.869655 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.871606 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.871840 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:46:01 crc kubenswrapper[4735]: I0317 01:46:01.886422 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r"] Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.035523 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.037091 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfkh4\" (UniqueName: \"kubernetes.io/projected/3a17772e-46c6-4b61-8fd1-0626f2097355-kube-api-access-qfkh4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.037254 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.037534 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.037674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.038351 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.143135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.143349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.143452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.143646 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.143732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfkh4\" (UniqueName: \"kubernetes.io/projected/3a17772e-46c6-4b61-8fd1-0626f2097355-kube-api-access-qfkh4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.143808 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.153068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.153673 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.155181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.162890 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.162989 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.173770 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfkh4\" (UniqueName: \"kubernetes.io/projected/3a17772e-46c6-4b61-8fd1-0626f2097355-kube-api-access-qfkh4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r\" (UID: 
\"3a17772e-46c6-4b61-8fd1-0626f2097355\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.183601 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.736585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-85mpv" event={"ID":"697cb47c-75db-4d5e-a348-23fd7b455b06","Type":"ContainerStarted","Data":"a7a5ad6a80ead65cdbd8601b4c4b351dee2206b4c56ef38425df744d4a3a51e7"} Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.758189 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561866-85mpv" podStartSLOduration=1.641674799 podStartE2EDuration="2.758173602s" podCreationTimestamp="2026-03-17 01:46:00 +0000 UTC" firstStartedPulling="2026-03-17 01:46:01.004213735 +0000 UTC m=+2186.636446713" lastFinishedPulling="2026-03-17 01:46:02.120712528 +0000 UTC m=+2187.752945516" observedRunningTime="2026-03-17 01:46:02.755303492 +0000 UTC m=+2188.387536460" watchObservedRunningTime="2026-03-17 01:46:02.758173602 +0000 UTC m=+2188.390406580" Mar 17 01:46:02 crc kubenswrapper[4735]: I0317 01:46:02.929195 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r"] Mar 17 01:46:03 crc kubenswrapper[4735]: I0317 01:46:03.747345 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" event={"ID":"3a17772e-46c6-4b61-8fd1-0626f2097355","Type":"ContainerStarted","Data":"39d45304d8746a9338d8372bf11a8b6706d480615910835647ea4aebbfb67aff"} Mar 17 01:46:03 crc kubenswrapper[4735]: I0317 01:46:03.749611 4735 generic.go:334] "Generic (PLEG): container finished" podID="697cb47c-75db-4d5e-a348-23fd7b455b06" 
containerID="a7a5ad6a80ead65cdbd8601b4c4b351dee2206b4c56ef38425df744d4a3a51e7" exitCode=0 Mar 17 01:46:03 crc kubenswrapper[4735]: I0317 01:46:03.749642 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-85mpv" event={"ID":"697cb47c-75db-4d5e-a348-23fd7b455b06","Type":"ContainerDied","Data":"a7a5ad6a80ead65cdbd8601b4c4b351dee2206b4c56ef38425df744d4a3a51e7"} Mar 17 01:46:04 crc kubenswrapper[4735]: I0317 01:46:04.777390 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" event={"ID":"3a17772e-46c6-4b61-8fd1-0626f2097355","Type":"ContainerStarted","Data":"410f16bb99f34e399ba00ff7a32e18c50fe1ad9157e2a2ddfe67e88a23d66c72"} Mar 17 01:46:04 crc kubenswrapper[4735]: I0317 01:46:04.842216 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" podStartSLOduration=3.276854063 podStartE2EDuration="3.842193898s" podCreationTimestamp="2026-03-17 01:46:01 +0000 UTC" firstStartedPulling="2026-03-17 01:46:02.943810199 +0000 UTC m=+2188.576043177" lastFinishedPulling="2026-03-17 01:46:03.509150024 +0000 UTC m=+2189.141383012" observedRunningTime="2026-03-17 01:46:04.834554502 +0000 UTC m=+2190.466787500" watchObservedRunningTime="2026-03-17 01:46:04.842193898 +0000 UTC m=+2190.474426886" Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.157665 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.209468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfq4x\" (UniqueName: \"kubernetes.io/projected/697cb47c-75db-4d5e-a348-23fd7b455b06-kube-api-access-rfq4x\") pod \"697cb47c-75db-4d5e-a348-23fd7b455b06\" (UID: \"697cb47c-75db-4d5e-a348-23fd7b455b06\") " Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.228055 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697cb47c-75db-4d5e-a348-23fd7b455b06-kube-api-access-rfq4x" (OuterVolumeSpecName: "kube-api-access-rfq4x") pod "697cb47c-75db-4d5e-a348-23fd7b455b06" (UID: "697cb47c-75db-4d5e-a348-23fd7b455b06"). InnerVolumeSpecName "kube-api-access-rfq4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.312339 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfq4x\" (UniqueName: \"kubernetes.io/projected/697cb47c-75db-4d5e-a348-23fd7b455b06-kube-api-access-rfq4x\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.796202 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-85mpv" Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.798140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-85mpv" event={"ID":"697cb47c-75db-4d5e-a348-23fd7b455b06","Type":"ContainerDied","Data":"f0e685250378f767b82c75ca5e6046d36145a2413e0a32c0d8b142755e9fd17a"} Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.798218 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0e685250378f767b82c75ca5e6046d36145a2413e0a32c0d8b142755e9fd17a" Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.854572 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-t4kxz"] Mar 17 01:46:05 crc kubenswrapper[4735]: I0317 01:46:05.863356 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-t4kxz"] Mar 17 01:46:07 crc kubenswrapper[4735]: I0317 01:46:07.091743 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b21370-ca31-43b2-a5fb-78409f181a24" path="/var/lib/kubelet/pods/b9b21370-ca31-43b2-a5fb-78409f181a24/volumes" Mar 17 01:46:42 crc kubenswrapper[4735]: I0317 01:46:42.606337 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:46:42 crc kubenswrapper[4735]: I0317 01:46:42.608027 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:46:55 crc 
kubenswrapper[4735]: I0317 01:46:55.197947 4735 scope.go:117] "RemoveContainer" containerID="435841b462148c1e0951eacc109db40a377200217e3bab76ad5f12b55f0c1a2f" Mar 17 01:46:58 crc kubenswrapper[4735]: I0317 01:46:58.366097 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a17772e-46c6-4b61-8fd1-0626f2097355" containerID="410f16bb99f34e399ba00ff7a32e18c50fe1ad9157e2a2ddfe67e88a23d66c72" exitCode=0 Mar 17 01:46:58 crc kubenswrapper[4735]: I0317 01:46:58.366292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" event={"ID":"3a17772e-46c6-4b61-8fd1-0626f2097355","Type":"ContainerDied","Data":"410f16bb99f34e399ba00ff7a32e18c50fe1ad9157e2a2ddfe67e88a23d66c72"} Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.850375 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.953415 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfkh4\" (UniqueName: \"kubernetes.io/projected/3a17772e-46c6-4b61-8fd1-0626f2097355-kube-api-access-qfkh4\") pod \"3a17772e-46c6-4b61-8fd1-0626f2097355\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.953480 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-metadata-combined-ca-bundle\") pod \"3a17772e-46c6-4b61-8fd1-0626f2097355\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.953498 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-nova-metadata-neutron-config-0\") pod \"3a17772e-46c6-4b61-8fd1-0626f2097355\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.953522 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-inventory\") pod \"3a17772e-46c6-4b61-8fd1-0626f2097355\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.953620 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3a17772e-46c6-4b61-8fd1-0626f2097355\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.953647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-ssh-key-openstack-edpm-ipam\") pod \"3a17772e-46c6-4b61-8fd1-0626f2097355\" (UID: \"3a17772e-46c6-4b61-8fd1-0626f2097355\") " Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.958826 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3a17772e-46c6-4b61-8fd1-0626f2097355" (UID: "3a17772e-46c6-4b61-8fd1-0626f2097355"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.966403 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a17772e-46c6-4b61-8fd1-0626f2097355-kube-api-access-qfkh4" (OuterVolumeSpecName: "kube-api-access-qfkh4") pod "3a17772e-46c6-4b61-8fd1-0626f2097355" (UID: "3a17772e-46c6-4b61-8fd1-0626f2097355"). InnerVolumeSpecName "kube-api-access-qfkh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.980487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3a17772e-46c6-4b61-8fd1-0626f2097355" (UID: "3a17772e-46c6-4b61-8fd1-0626f2097355"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.987966 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3a17772e-46c6-4b61-8fd1-0626f2097355" (UID: "3a17772e-46c6-4b61-8fd1-0626f2097355"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.991333 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3a17772e-46c6-4b61-8fd1-0626f2097355" (UID: "3a17772e-46c6-4b61-8fd1-0626f2097355"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:46:59 crc kubenswrapper[4735]: I0317 01:46:59.995181 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-inventory" (OuterVolumeSpecName: "inventory") pod "3a17772e-46c6-4b61-8fd1-0626f2097355" (UID: "3a17772e-46c6-4b61-8fd1-0626f2097355"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.054999 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.055037 4735 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.055053 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.055070 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfkh4\" (UniqueName: \"kubernetes.io/projected/3a17772e-46c6-4b61-8fd1-0626f2097355-kube-api-access-qfkh4\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.055084 4735 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.055098 4735 
reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a17772e-46c6-4b61-8fd1-0626f2097355-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.384718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" event={"ID":"3a17772e-46c6-4b61-8fd1-0626f2097355","Type":"ContainerDied","Data":"39d45304d8746a9338d8372bf11a8b6706d480615910835647ea4aebbfb67aff"} Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.384770 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39d45304d8746a9338d8372bf11a8b6706d480615910835647ea4aebbfb67aff" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.385425 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.503747 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz"] Mar 17 01:47:00 crc kubenswrapper[4735]: E0317 01:47:00.504144 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697cb47c-75db-4d5e-a348-23fd7b455b06" containerName="oc" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.504160 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="697cb47c-75db-4d5e-a348-23fd7b455b06" containerName="oc" Mar 17 01:47:00 crc kubenswrapper[4735]: E0317 01:47:00.504175 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a17772e-46c6-4b61-8fd1-0626f2097355" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.504183 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a17772e-46c6-4b61-8fd1-0626f2097355" 
containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.504345 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a17772e-46c6-4b61-8fd1-0626f2097355" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.504367 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="697cb47c-75db-4d5e-a348-23fd7b455b06" containerName="oc" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.504939 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.507038 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.507134 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.509889 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.510180 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.510408 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.517677 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz"] Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.566776 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.566881 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6nd\" (UniqueName: \"kubernetes.io/projected/7354d451-8e24-4c65-855c-e6a33c66d134-kube-api-access-lx6nd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.566961 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.567072 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.567174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: 
\"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.669052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.669929 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6nd\" (UniqueName: \"kubernetes.io/projected/7354d451-8e24-4c65-855c-e6a33c66d134-kube-api-access-lx6nd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.670015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.670061 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.670108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.681371 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.682690 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.683418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.695354 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.699561 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6nd\" (UniqueName: \"kubernetes.io/projected/7354d451-8e24-4c65-855c-e6a33c66d134-kube-api-access-lx6nd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vswmz\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:00 crc kubenswrapper[4735]: I0317 01:47:00.826321 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:47:01 crc kubenswrapper[4735]: I0317 01:47:01.444011 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz"] Mar 17 01:47:01 crc kubenswrapper[4735]: I0317 01:47:01.451553 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:47:02 crc kubenswrapper[4735]: I0317 01:47:02.399960 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" event={"ID":"7354d451-8e24-4c65-855c-e6a33c66d134","Type":"ContainerStarted","Data":"b879983169f6e81e47d42fd66f6b22638a77f5866a9a10fe308fe51064fab9ad"} Mar 17 01:47:02 crc kubenswrapper[4735]: I0317 01:47:02.400205 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" event={"ID":"7354d451-8e24-4c65-855c-e6a33c66d134","Type":"ContainerStarted","Data":"2ccadc829269cfac786292b075966236f720b24d91b00e13cbd15d33f63a4259"} Mar 17 01:47:02 crc kubenswrapper[4735]: I0317 01:47:02.453841 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" podStartSLOduration=2.025382644 podStartE2EDuration="2.453826491s" 
podCreationTimestamp="2026-03-17 01:47:00 +0000 UTC" firstStartedPulling="2026-03-17 01:47:01.451326816 +0000 UTC m=+2247.083559794" lastFinishedPulling="2026-03-17 01:47:01.879770653 +0000 UTC m=+2247.512003641" observedRunningTime="2026-03-17 01:47:02.451695618 +0000 UTC m=+2248.083928596" watchObservedRunningTime="2026-03-17 01:47:02.453826491 +0000 UTC m=+2248.086059469" Mar 17 01:47:03 crc kubenswrapper[4735]: I0317 01:47:03.920660 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b6lbr"] Mar 17 01:47:03 crc kubenswrapper[4735]: I0317 01:47:03.930380 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:03 crc kubenswrapper[4735]: I0317 01:47:03.937047 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6lbr"] Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.044062 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-utilities\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.044378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-catalog-content\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.044671 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdm2\" (UniqueName: 
\"kubernetes.io/projected/55372d96-d312-47ee-b94d-1c3b0168a0cd-kube-api-access-zvdm2\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.146385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdm2\" (UniqueName: \"kubernetes.io/projected/55372d96-d312-47ee-b94d-1c3b0168a0cd-kube-api-access-zvdm2\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.146484 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-utilities\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.146573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-catalog-content\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.147080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-catalog-content\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.147357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-utilities\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.165893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdm2\" (UniqueName: \"kubernetes.io/projected/55372d96-d312-47ee-b94d-1c3b0168a0cd-kube-api-access-zvdm2\") pod \"redhat-marketplace-b6lbr\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.253760 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:04 crc kubenswrapper[4735]: I0317 01:47:04.793194 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6lbr"] Mar 17 01:47:04 crc kubenswrapper[4735]: W0317 01:47:04.800116 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55372d96_d312_47ee_b94d_1c3b0168a0cd.slice/crio-4ebd6602876fbe9a89bb237d6371da480dc290449ae8061c1936116dc08b8e91 WatchSource:0}: Error finding container 4ebd6602876fbe9a89bb237d6371da480dc290449ae8061c1936116dc08b8e91: Status 404 returned error can't find the container with id 4ebd6602876fbe9a89bb237d6371da480dc290449ae8061c1936116dc08b8e91 Mar 17 01:47:05 crc kubenswrapper[4735]: I0317 01:47:05.438092 4735 generic.go:334] "Generic (PLEG): container finished" podID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerID="25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b" exitCode=0 Mar 17 01:47:05 crc kubenswrapper[4735]: I0317 01:47:05.438158 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" 
event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerDied","Data":"25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b"} Mar 17 01:47:05 crc kubenswrapper[4735]: I0317 01:47:05.439004 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerStarted","Data":"4ebd6602876fbe9a89bb237d6371da480dc290449ae8061c1936116dc08b8e91"} Mar 17 01:47:06 crc kubenswrapper[4735]: I0317 01:47:06.451262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerStarted","Data":"98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e"} Mar 17 01:47:08 crc kubenswrapper[4735]: I0317 01:47:08.471510 4735 generic.go:334] "Generic (PLEG): container finished" podID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerID="98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e" exitCode=0 Mar 17 01:47:08 crc kubenswrapper[4735]: I0317 01:47:08.471567 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerDied","Data":"98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e"} Mar 17 01:47:09 crc kubenswrapper[4735]: I0317 01:47:09.483114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerStarted","Data":"4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917"} Mar 17 01:47:09 crc kubenswrapper[4735]: I0317 01:47:09.504907 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b6lbr" podStartSLOduration=3.040562538 podStartE2EDuration="6.504887871s" podCreationTimestamp="2026-03-17 01:47:03 +0000 
UTC" firstStartedPulling="2026-03-17 01:47:05.442099675 +0000 UTC m=+2251.074332663" lastFinishedPulling="2026-03-17 01:47:08.906425028 +0000 UTC m=+2254.538657996" observedRunningTime="2026-03-17 01:47:09.498046724 +0000 UTC m=+2255.130279712" watchObservedRunningTime="2026-03-17 01:47:09.504887871 +0000 UTC m=+2255.137120849" Mar 17 01:47:12 crc kubenswrapper[4735]: I0317 01:47:12.606139 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:47:12 crc kubenswrapper[4735]: I0317 01:47:12.606704 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:47:14 crc kubenswrapper[4735]: I0317 01:47:14.254844 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:14 crc kubenswrapper[4735]: I0317 01:47:14.255324 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:15 crc kubenswrapper[4735]: I0317 01:47:15.296115 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-b6lbr" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="registry-server" probeResult="failure" output=< Mar 17 01:47:15 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:47:15 crc kubenswrapper[4735]: > Mar 17 01:47:24 crc kubenswrapper[4735]: I0317 01:47:24.329105 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:24 crc kubenswrapper[4735]: I0317 01:47:24.394066 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:24 crc kubenswrapper[4735]: I0317 01:47:24.582244 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6lbr"] Mar 17 01:47:25 crc kubenswrapper[4735]: I0317 01:47:25.617649 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b6lbr" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="registry-server" containerID="cri-o://4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917" gracePeriod=2 Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.128942 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.265566 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-utilities\") pod \"55372d96-d312-47ee-b94d-1c3b0168a0cd\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.265670 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvdm2\" (UniqueName: \"kubernetes.io/projected/55372d96-d312-47ee-b94d-1c3b0168a0cd-kube-api-access-zvdm2\") pod \"55372d96-d312-47ee-b94d-1c3b0168a0cd\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.265767 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-catalog-content\") pod 
\"55372d96-d312-47ee-b94d-1c3b0168a0cd\" (UID: \"55372d96-d312-47ee-b94d-1c3b0168a0cd\") " Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.266223 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-utilities" (OuterVolumeSpecName: "utilities") pod "55372d96-d312-47ee-b94d-1c3b0168a0cd" (UID: "55372d96-d312-47ee-b94d-1c3b0168a0cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.298073 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55372d96-d312-47ee-b94d-1c3b0168a0cd-kube-api-access-zvdm2" (OuterVolumeSpecName: "kube-api-access-zvdm2") pod "55372d96-d312-47ee-b94d-1c3b0168a0cd" (UID: "55372d96-d312-47ee-b94d-1c3b0168a0cd"). InnerVolumeSpecName "kube-api-access-zvdm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.330549 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55372d96-d312-47ee-b94d-1c3b0168a0cd" (UID: "55372d96-d312-47ee-b94d-1c3b0168a0cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.371634 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.371671 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvdm2\" (UniqueName: \"kubernetes.io/projected/55372d96-d312-47ee-b94d-1c3b0168a0cd-kube-api-access-zvdm2\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.371681 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55372d96-d312-47ee-b94d-1c3b0168a0cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.626764 4735 generic.go:334] "Generic (PLEG): container finished" podID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerID="4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917" exitCode=0 Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.626805 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerDied","Data":"4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917"} Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.626831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6lbr" event={"ID":"55372d96-d312-47ee-b94d-1c3b0168a0cd","Type":"ContainerDied","Data":"4ebd6602876fbe9a89bb237d6371da480dc290449ae8061c1936116dc08b8e91"} Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.626834 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6lbr" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.626847 4735 scope.go:117] "RemoveContainer" containerID="4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.644826 4735 scope.go:117] "RemoveContainer" containerID="98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.660344 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6lbr"] Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.668952 4735 scope.go:117] "RemoveContainer" containerID="25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.671787 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6lbr"] Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.727852 4735 scope.go:117] "RemoveContainer" containerID="4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917" Mar 17 01:47:26 crc kubenswrapper[4735]: E0317 01:47:26.728685 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917\": container with ID starting with 4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917 not found: ID does not exist" containerID="4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.728734 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917"} err="failed to get container status \"4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917\": rpc error: code = NotFound desc = could not find container 
\"4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917\": container with ID starting with 4f78d00b1cc1cc9944e561f21044c876c876187a669a75cefdad116ba50b9917 not found: ID does not exist" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.728761 4735 scope.go:117] "RemoveContainer" containerID="98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e" Mar 17 01:47:26 crc kubenswrapper[4735]: E0317 01:47:26.729268 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e\": container with ID starting with 98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e not found: ID does not exist" containerID="98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.729298 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e"} err="failed to get container status \"98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e\": rpc error: code = NotFound desc = could not find container \"98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e\": container with ID starting with 98ebfa508c6c84021bf1a3c395e492432514378b0af667ecc1ab0e90a74b0c1e not found: ID does not exist" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.729320 4735 scope.go:117] "RemoveContainer" containerID="25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b" Mar 17 01:47:26 crc kubenswrapper[4735]: E0317 01:47:26.729531 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b\": container with ID starting with 25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b not found: ID does not exist" 
containerID="25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b" Mar 17 01:47:26 crc kubenswrapper[4735]: I0317 01:47:26.729551 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b"} err="failed to get container status \"25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b\": rpc error: code = NotFound desc = could not find container \"25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b\": container with ID starting with 25fdde7a73411d89784e1f5abe331d1fe21277a980e4686bb21d988cd038427b not found: ID does not exist" Mar 17 01:47:27 crc kubenswrapper[4735]: I0317 01:47:27.086652 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" path="/var/lib/kubelet/pods/55372d96-d312-47ee-b94d-1c3b0168a0cd/volumes" Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.606425 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.607232 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.607295 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.608537 4735 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.608659 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" gracePeriod=600 Mar 17 01:47:42 crc kubenswrapper[4735]: E0317 01:47:42.797723 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.835511 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" exitCode=0 Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.835578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb"} Mar 17 01:47:42 crc kubenswrapper[4735]: I0317 01:47:42.835699 4735 scope.go:117] "RemoveContainer" containerID="599b0f1aa110d3294f873b6200db87c7a4f4c9bfd5e3739ccd5f9a116ac7a76c" Mar 17 01:47:42 crc 
kubenswrapper[4735]: I0317 01:47:42.836728 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:47:42 crc kubenswrapper[4735]: E0317 01:47:42.837257 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:47:56 crc kubenswrapper[4735]: I0317 01:47:56.075137 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:47:56 crc kubenswrapper[4735]: E0317 01:47:56.076326 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.190393 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561868-d8m6x"] Mar 17 01:48:00 crc kubenswrapper[4735]: E0317 01:48:00.193917 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="extract-utilities" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.193952 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="extract-utilities" Mar 17 01:48:00 crc kubenswrapper[4735]: E0317 01:48:00.194202 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="registry-server" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.194224 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="registry-server" Mar 17 01:48:00 crc kubenswrapper[4735]: E0317 01:48:00.194305 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="extract-content" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.194322 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="extract-content" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.196243 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="55372d96-d312-47ee-b94d-1c3b0168a0cd" containerName="registry-server" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.198019 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.202072 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.202620 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.202636 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.210239 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-d8m6x"] Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.325478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnpdc\" (UniqueName: \"kubernetes.io/projected/0dc36276-ff9c-47ff-b33f-42d47c854281-kube-api-access-cnpdc\") pod \"auto-csr-approver-29561868-d8m6x\" (UID: \"0dc36276-ff9c-47ff-b33f-42d47c854281\") " pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.429714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnpdc\" (UniqueName: \"kubernetes.io/projected/0dc36276-ff9c-47ff-b33f-42d47c854281-kube-api-access-cnpdc\") pod \"auto-csr-approver-29561868-d8m6x\" (UID: \"0dc36276-ff9c-47ff-b33f-42d47c854281\") " pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.453994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnpdc\" (UniqueName: \"kubernetes.io/projected/0dc36276-ff9c-47ff-b33f-42d47c854281-kube-api-access-cnpdc\") pod \"auto-csr-approver-29561868-d8m6x\" (UID: \"0dc36276-ff9c-47ff-b33f-42d47c854281\") " 
pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.542462 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:00 crc kubenswrapper[4735]: I0317 01:48:00.851672 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-d8m6x"] Mar 17 01:48:01 crc kubenswrapper[4735]: I0317 01:48:01.065429 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" event={"ID":"0dc36276-ff9c-47ff-b33f-42d47c854281","Type":"ContainerStarted","Data":"c8c6616a2b773c9e9f7f007bb712405732fcdc938c06b167a9617f4d25a6cb1d"} Mar 17 01:48:03 crc kubenswrapper[4735]: I0317 01:48:03.090655 4735 generic.go:334] "Generic (PLEG): container finished" podID="0dc36276-ff9c-47ff-b33f-42d47c854281" containerID="65632f82533e9d73c42e3c52a275a1e91c14dc2adee7801d431ee32028e940c0" exitCode=0 Mar 17 01:48:03 crc kubenswrapper[4735]: I0317 01:48:03.096150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" event={"ID":"0dc36276-ff9c-47ff-b33f-42d47c854281","Type":"ContainerDied","Data":"65632f82533e9d73c42e3c52a275a1e91c14dc2adee7801d431ee32028e940c0"} Mar 17 01:48:04 crc kubenswrapper[4735]: I0317 01:48:04.507561 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:04 crc kubenswrapper[4735]: I0317 01:48:04.651968 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnpdc\" (UniqueName: \"kubernetes.io/projected/0dc36276-ff9c-47ff-b33f-42d47c854281-kube-api-access-cnpdc\") pod \"0dc36276-ff9c-47ff-b33f-42d47c854281\" (UID: \"0dc36276-ff9c-47ff-b33f-42d47c854281\") " Mar 17 01:48:04 crc kubenswrapper[4735]: I0317 01:48:04.660812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc36276-ff9c-47ff-b33f-42d47c854281-kube-api-access-cnpdc" (OuterVolumeSpecName: "kube-api-access-cnpdc") pod "0dc36276-ff9c-47ff-b33f-42d47c854281" (UID: "0dc36276-ff9c-47ff-b33f-42d47c854281"). InnerVolumeSpecName "kube-api-access-cnpdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:48:04 crc kubenswrapper[4735]: I0317 01:48:04.755143 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnpdc\" (UniqueName: \"kubernetes.io/projected/0dc36276-ff9c-47ff-b33f-42d47c854281-kube-api-access-cnpdc\") on node \"crc\" DevicePath \"\"" Mar 17 01:48:05 crc kubenswrapper[4735]: I0317 01:48:05.113600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" event={"ID":"0dc36276-ff9c-47ff-b33f-42d47c854281","Type":"ContainerDied","Data":"c8c6616a2b773c9e9f7f007bb712405732fcdc938c06b167a9617f4d25a6cb1d"} Mar 17 01:48:05 crc kubenswrapper[4735]: I0317 01:48:05.113638 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8c6616a2b773c9e9f7f007bb712405732fcdc938c06b167a9617f4d25a6cb1d" Mar 17 01:48:05 crc kubenswrapper[4735]: I0317 01:48:05.113696 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-d8m6x" Mar 17 01:48:05 crc kubenswrapper[4735]: I0317 01:48:05.607685 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-6n6th"] Mar 17 01:48:05 crc kubenswrapper[4735]: I0317 01:48:05.617407 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-6n6th"] Mar 17 01:48:07 crc kubenswrapper[4735]: I0317 01:48:07.074215 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:48:07 crc kubenswrapper[4735]: E0317 01:48:07.075051 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:48:07 crc kubenswrapper[4735]: I0317 01:48:07.092450 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89fe46ef-0187-4f2e-9afb-b5885cba3dc7" path="/var/lib/kubelet/pods/89fe46ef-0187-4f2e-9afb-b5885cba3dc7/volumes" Mar 17 01:48:22 crc kubenswrapper[4735]: I0317 01:48:22.073396 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:48:22 crc kubenswrapper[4735]: E0317 01:48:22.074523 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:48:35 crc kubenswrapper[4735]: I0317 01:48:35.079930 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:48:35 crc kubenswrapper[4735]: E0317 01:48:35.080648 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:48:46 crc kubenswrapper[4735]: I0317 01:48:46.073207 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:48:46 crc kubenswrapper[4735]: E0317 01:48:46.073941 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:48:55 crc kubenswrapper[4735]: I0317 01:48:55.334742 4735 scope.go:117] "RemoveContainer" containerID="89ff04345e4329c4b9c351e94b091d7fee63667d255f8a44356b7822b53548ad" Mar 17 01:49:01 crc kubenswrapper[4735]: I0317 01:49:01.074115 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:49:01 crc kubenswrapper[4735]: E0317 01:49:01.074926 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.466080 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kjm9t"] Mar 17 01:49:06 crc kubenswrapper[4735]: E0317 01:49:06.466907 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc36276-ff9c-47ff-b33f-42d47c854281" containerName="oc" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.466920 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc36276-ff9c-47ff-b33f-42d47c854281" containerName="oc" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.467136 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc36276-ff9c-47ff-b33f-42d47c854281" containerName="oc" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.468440 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.519609 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a217fdf0-6e31-4c78-a859-428945d1067e-utilities\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.519718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a217fdf0-6e31-4c78-a859-428945d1067e-catalog-content\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.519766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flpw6\" (UniqueName: \"kubernetes.io/projected/a217fdf0-6e31-4c78-a859-428945d1067e-kube-api-access-flpw6\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.523882 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjm9t"] Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.621726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a217fdf0-6e31-4c78-a859-428945d1067e-catalog-content\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.621939 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-flpw6\" (UniqueName: \"kubernetes.io/projected/a217fdf0-6e31-4c78-a859-428945d1067e-kube-api-access-flpw6\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.622043 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a217fdf0-6e31-4c78-a859-428945d1067e-utilities\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.622731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a217fdf0-6e31-4c78-a859-428945d1067e-utilities\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.623218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a217fdf0-6e31-4c78-a859-428945d1067e-catalog-content\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.647891 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flpw6\" (UniqueName: \"kubernetes.io/projected/a217fdf0-6e31-4c78-a859-428945d1067e-kube-api-access-flpw6\") pod \"certified-operators-kjm9t\" (UID: \"a217fdf0-6e31-4c78-a859-428945d1067e\") " pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:06 crc kubenswrapper[4735]: I0317 01:49:06.835557 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:07 crc kubenswrapper[4735]: I0317 01:49:07.367055 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjm9t"] Mar 17 01:49:07 crc kubenswrapper[4735]: I0317 01:49:07.842085 4735 generic.go:334] "Generic (PLEG): container finished" podID="a217fdf0-6e31-4c78-a859-428945d1067e" containerID="c0bd18a88a197f937039071c04e8de10de438fadd40013fd302c48ef7395d7cc" exitCode=0 Mar 17 01:49:07 crc kubenswrapper[4735]: I0317 01:49:07.842328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjm9t" event={"ID":"a217fdf0-6e31-4c78-a859-428945d1067e","Type":"ContainerDied","Data":"c0bd18a88a197f937039071c04e8de10de438fadd40013fd302c48ef7395d7cc"} Mar 17 01:49:07 crc kubenswrapper[4735]: I0317 01:49:07.842353 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjm9t" event={"ID":"a217fdf0-6e31-4c78-a859-428945d1067e","Type":"ContainerStarted","Data":"ecbe65bfd4ec318fca344d337ea9fb182db66c34122eb65f3e952898941216f7"} Mar 17 01:49:08 crc kubenswrapper[4735]: I0317 01:49:08.855611 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tb2f7"] Mar 17 01:49:08 crc kubenswrapper[4735]: I0317 01:49:08.858148 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:08 crc kubenswrapper[4735]: I0317 01:49:08.865330 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tb2f7"] Mar 17 01:49:08 crc kubenswrapper[4735]: I0317 01:49:08.970769 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-catalog-content\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:08 crc kubenswrapper[4735]: I0317 01:49:08.970933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv8h6\" (UniqueName: \"kubernetes.io/projected/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-kube-api-access-vv8h6\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:08 crc kubenswrapper[4735]: I0317 01:49:08.970955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-utilities\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.045600 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zcz6b"] Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.054090 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.059734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zcz6b"] Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.073980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-catalog-content\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.074061 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv8h6\" (UniqueName: \"kubernetes.io/projected/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-kube-api-access-vv8h6\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.074080 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-utilities\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.074506 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-utilities\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.074732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-catalog-content\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.113635 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv8h6\" (UniqueName: \"kubernetes.io/projected/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-kube-api-access-vv8h6\") pod \"redhat-operators-tb2f7\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.175533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-catalog-content\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.175645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-utilities\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.175723 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxqq\" (UniqueName: \"kubernetes.io/projected/4f144236-b787-41f7-ba92-cec935402967-kube-api-access-wnxqq\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.181213 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.277474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxqq\" (UniqueName: \"kubernetes.io/projected/4f144236-b787-41f7-ba92-cec935402967-kube-api-access-wnxqq\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.277802 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-catalog-content\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.277854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-utilities\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.278340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-utilities\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.278362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-catalog-content\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " 
pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.298753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxqq\" (UniqueName: \"kubernetes.io/projected/4f144236-b787-41f7-ba92-cec935402967-kube-api-access-wnxqq\") pod \"community-operators-zcz6b\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.396511 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.776704 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tb2f7"] Mar 17 01:49:09 crc kubenswrapper[4735]: W0317 01:49:09.797183 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4b62e3_a7ca_4cd3_bc14_2ac46b6b94c8.slice/crio-b7f82585ded6a914a0d09385f86069e65721444e81f23fe33032d75da2d02d3d WatchSource:0}: Error finding container b7f82585ded6a914a0d09385f86069e65721444e81f23fe33032d75da2d02d3d: Status 404 returned error can't find the container with id b7f82585ded6a914a0d09385f86069e65721444e81f23fe33032d75da2d02d3d Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.876030 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerStarted","Data":"b7f82585ded6a914a0d09385f86069e65721444e81f23fe33032d75da2d02d3d"} Mar 17 01:49:09 crc kubenswrapper[4735]: I0317 01:49:09.986254 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zcz6b"] Mar 17 01:49:10 crc kubenswrapper[4735]: W0317 01:49:10.026383 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f144236_b787_41f7_ba92_cec935402967.slice/crio-efe89c6a167559e4fbb433989e1bff1ce92eb4648ce8ee8b54f4d6d93dd2c020 WatchSource:0}: Error finding container efe89c6a167559e4fbb433989e1bff1ce92eb4648ce8ee8b54f4d6d93dd2c020: Status 404 returned error can't find the container with id efe89c6a167559e4fbb433989e1bff1ce92eb4648ce8ee8b54f4d6d93dd2c020 Mar 17 01:49:10 crc kubenswrapper[4735]: I0317 01:49:10.888429 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerID="3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd" exitCode=0 Mar 17 01:49:10 crc kubenswrapper[4735]: I0317 01:49:10.888615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerDied","Data":"3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd"} Mar 17 01:49:10 crc kubenswrapper[4735]: I0317 01:49:10.891318 4735 generic.go:334] "Generic (PLEG): container finished" podID="4f144236-b787-41f7-ba92-cec935402967" containerID="9fc7332aff6c6e95b32f0d0aadf77961cffa46bebe63b7e964299d5afb14158c" exitCode=0 Mar 17 01:49:10 crc kubenswrapper[4735]: I0317 01:49:10.891374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerDied","Data":"9fc7332aff6c6e95b32f0d0aadf77961cffa46bebe63b7e964299d5afb14158c"} Mar 17 01:49:10 crc kubenswrapper[4735]: I0317 01:49:10.891400 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerStarted","Data":"efe89c6a167559e4fbb433989e1bff1ce92eb4648ce8ee8b54f4d6d93dd2c020"} Mar 17 01:49:12 crc kubenswrapper[4735]: I0317 01:49:12.919780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerStarted","Data":"e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403"} Mar 17 01:49:12 crc kubenswrapper[4735]: I0317 01:49:12.928395 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerStarted","Data":"ed57edd4cfdef69880e7607d5e347d1e8f5d5dee9468fd0cd162931af53cc5c9"} Mar 17 01:49:14 crc kubenswrapper[4735]: I0317 01:49:14.073447 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:49:14 crc kubenswrapper[4735]: E0317 01:49:14.074045 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:49:14 crc kubenswrapper[4735]: E0317 01:49:14.648847 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f144236_b787_41f7_ba92_cec935402967.slice/crio-ed57edd4cfdef69880e7607d5e347d1e8f5d5dee9468fd0cd162931af53cc5c9.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:49:14 crc kubenswrapper[4735]: I0317 01:49:14.951327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjm9t" event={"ID":"a217fdf0-6e31-4c78-a859-428945d1067e","Type":"ContainerStarted","Data":"c746c41ec34b6aad16b2dd7abe5a30f27bf4b95a41c8b2201bd02347598e62bf"} Mar 17 01:49:14 crc kubenswrapper[4735]: I0317 
01:49:14.953536 4735 generic.go:334] "Generic (PLEG): container finished" podID="4f144236-b787-41f7-ba92-cec935402967" containerID="ed57edd4cfdef69880e7607d5e347d1e8f5d5dee9468fd0cd162931af53cc5c9" exitCode=0 Mar 17 01:49:14 crc kubenswrapper[4735]: I0317 01:49:14.953694 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerDied","Data":"ed57edd4cfdef69880e7607d5e347d1e8f5d5dee9468fd0cd162931af53cc5c9"} Mar 17 01:49:15 crc kubenswrapper[4735]: I0317 01:49:15.965677 4735 generic.go:334] "Generic (PLEG): container finished" podID="a217fdf0-6e31-4c78-a859-428945d1067e" containerID="c746c41ec34b6aad16b2dd7abe5a30f27bf4b95a41c8b2201bd02347598e62bf" exitCode=0 Mar 17 01:49:15 crc kubenswrapper[4735]: I0317 01:49:15.965778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjm9t" event={"ID":"a217fdf0-6e31-4c78-a859-428945d1067e","Type":"ContainerDied","Data":"c746c41ec34b6aad16b2dd7abe5a30f27bf4b95a41c8b2201bd02347598e62bf"} Mar 17 01:49:16 crc kubenswrapper[4735]: I0317 01:49:16.976945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerStarted","Data":"d1e448a9e1a71d783ef2cf54162ea2a6ca09d53096efab5fe267e1ee207f912b"} Mar 17 01:49:16 crc kubenswrapper[4735]: I0317 01:49:16.979360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjm9t" event={"ID":"a217fdf0-6e31-4c78-a859-428945d1067e","Type":"ContainerStarted","Data":"8dbd6df6dd739ca7adc9e3ce7812f50cab86a704dfe9d9246d529ee730d0cba4"} Mar 17 01:49:16 crc kubenswrapper[4735]: I0317 01:49:16.997947 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zcz6b" podStartSLOduration=2.208039517 
podStartE2EDuration="7.997923476s" podCreationTimestamp="2026-03-17 01:49:09 +0000 UTC" firstStartedPulling="2026-03-17 01:49:10.892599516 +0000 UTC m=+2376.524832494" lastFinishedPulling="2026-03-17 01:49:16.682483485 +0000 UTC m=+2382.314716453" observedRunningTime="2026-03-17 01:49:16.993965749 +0000 UTC m=+2382.626198737" watchObservedRunningTime="2026-03-17 01:49:16.997923476 +0000 UTC m=+2382.630156464" Mar 17 01:49:17 crc kubenswrapper[4735]: I0317 01:49:17.017796 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kjm9t" podStartSLOduration=2.460982224 podStartE2EDuration="11.01776854s" podCreationTimestamp="2026-03-17 01:49:06 +0000 UTC" firstStartedPulling="2026-03-17 01:49:07.84360894 +0000 UTC m=+2373.475841918" lastFinishedPulling="2026-03-17 01:49:16.400395246 +0000 UTC m=+2382.032628234" observedRunningTime="2026-03-17 01:49:17.016262233 +0000 UTC m=+2382.648495251" watchObservedRunningTime="2026-03-17 01:49:17.01776854 +0000 UTC m=+2382.650001518" Mar 17 01:49:19 crc kubenswrapper[4735]: I0317 01:49:19.006849 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerID="e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403" exitCode=0 Mar 17 01:49:19 crc kubenswrapper[4735]: I0317 01:49:19.006888 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerDied","Data":"e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403"} Mar 17 01:49:19 crc kubenswrapper[4735]: I0317 01:49:19.396815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:19 crc kubenswrapper[4735]: I0317 01:49:19.397168 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zcz6b" 
Mar 17 01:49:20 crc kubenswrapper[4735]: I0317 01:49:20.019141 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerStarted","Data":"3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083"} Mar 17 01:49:20 crc kubenswrapper[4735]: I0317 01:49:20.048379 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tb2f7" podStartSLOduration=3.415199969 podStartE2EDuration="12.048357837s" podCreationTimestamp="2026-03-17 01:49:08 +0000 UTC" firstStartedPulling="2026-03-17 01:49:10.892274348 +0000 UTC m=+2376.524507326" lastFinishedPulling="2026-03-17 01:49:19.525432206 +0000 UTC m=+2385.157665194" observedRunningTime="2026-03-17 01:49:20.043115899 +0000 UTC m=+2385.675348897" watchObservedRunningTime="2026-03-17 01:49:20.048357837 +0000 UTC m=+2385.680590815" Mar 17 01:49:20 crc kubenswrapper[4735]: I0317 01:49:20.462651 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zcz6b" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="registry-server" probeResult="failure" output=< Mar 17 01:49:20 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:49:20 crc kubenswrapper[4735]: > Mar 17 01:49:26 crc kubenswrapper[4735]: I0317 01:49:26.836572 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:26 crc kubenswrapper[4735]: I0317 01:49:26.837258 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:27 crc kubenswrapper[4735]: I0317 01:49:27.073364 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:49:27 crc kubenswrapper[4735]: E0317 01:49:27.073965 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:49:27 crc kubenswrapper[4735]: I0317 01:49:27.914445 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kjm9t" podUID="a217fdf0-6e31-4c78-a859-428945d1067e" containerName="registry-server" probeResult="failure" output=< Mar 17 01:49:27 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:49:27 crc kubenswrapper[4735]: > Mar 17 01:49:29 crc kubenswrapper[4735]: I0317 01:49:29.182087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:29 crc kubenswrapper[4735]: I0317 01:49:29.183542 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:30 crc kubenswrapper[4735]: I0317 01:49:30.241747 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tb2f7" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="registry-server" probeResult="failure" output=< Mar 17 01:49:30 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:49:30 crc kubenswrapper[4735]: > Mar 17 01:49:30 crc kubenswrapper[4735]: I0317 01:49:30.451958 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zcz6b" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="registry-server" probeResult="failure" output=< Mar 17 01:49:30 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" 
within 1s Mar 17 01:49:30 crc kubenswrapper[4735]: > Mar 17 01:49:36 crc kubenswrapper[4735]: I0317 01:49:36.902805 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:36 crc kubenswrapper[4735]: I0317 01:49:36.954949 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kjm9t" Mar 17 01:49:37 crc kubenswrapper[4735]: I0317 01:49:37.483801 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjm9t"] Mar 17 01:49:37 crc kubenswrapper[4735]: I0317 01:49:37.659734 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:49:37 crc kubenswrapper[4735]: I0317 01:49:37.660045 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tfwtc" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="registry-server" containerID="cri-o://4a11a07354712c109887152b4feb62031a0cd1e0f00d4b7e22db85221970d721" gracePeriod=2 Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.075966 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:49:38 crc kubenswrapper[4735]: E0317 01:49:38.076437 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.197205 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" 
containerID="4a11a07354712c109887152b4feb62031a0cd1e0f00d4b7e22db85221970d721" exitCode=0 Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.198021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerDied","Data":"4a11a07354712c109887152b4feb62031a0cd1e0f00d4b7e22db85221970d721"} Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.198044 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfwtc" event={"ID":"8c0d53e6-a114-4a1e-946e-1f0278e9984c","Type":"ContainerDied","Data":"989891560fda3ac0a087793cfdcbf03cf40d64587167a4e3939286bcf8b62dcd"} Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.198054 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="989891560fda3ac0a087793cfdcbf03cf40d64587167a4e3939286bcf8b62dcd" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.238558 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.305726 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24dxk\" (UniqueName: \"kubernetes.io/projected/8c0d53e6-a114-4a1e-946e-1f0278e9984c-kube-api-access-24dxk\") pod \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.305823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-utilities\") pod \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.305999 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-catalog-content\") pod \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\" (UID: \"8c0d53e6-a114-4a1e-946e-1f0278e9984c\") " Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.306911 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-utilities" (OuterVolumeSpecName: "utilities") pod "8c0d53e6-a114-4a1e-946e-1f0278e9984c" (UID: "8c0d53e6-a114-4a1e-946e-1f0278e9984c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.318305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0d53e6-a114-4a1e-946e-1f0278e9984c-kube-api-access-24dxk" (OuterVolumeSpecName: "kube-api-access-24dxk") pod "8c0d53e6-a114-4a1e-946e-1f0278e9984c" (UID: "8c0d53e6-a114-4a1e-946e-1f0278e9984c"). InnerVolumeSpecName "kube-api-access-24dxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.380218 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c0d53e6-a114-4a1e-946e-1f0278e9984c" (UID: "8c0d53e6-a114-4a1e-946e-1f0278e9984c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.408502 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.408532 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24dxk\" (UniqueName: \"kubernetes.io/projected/8c0d53e6-a114-4a1e-946e-1f0278e9984c-kube-api-access-24dxk\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:38 crc kubenswrapper[4735]: I0317 01:49:38.408548 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0d53e6-a114-4a1e-946e-1f0278e9984c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:39 crc kubenswrapper[4735]: I0317 01:49:39.213025 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfwtc" Mar 17 01:49:39 crc kubenswrapper[4735]: I0317 01:49:39.251684 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:49:39 crc kubenswrapper[4735]: I0317 01:49:39.271906 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tfwtc"] Mar 17 01:49:39 crc kubenswrapper[4735]: I0317 01:49:39.448176 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:39 crc kubenswrapper[4735]: I0317 01:49:39.525828 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:40 crc kubenswrapper[4735]: I0317 01:49:40.240668 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tb2f7" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="registry-server" probeResult="failure" output=< Mar 17 01:49:40 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 01:49:40 crc kubenswrapper[4735]: > Mar 17 01:49:41 crc kubenswrapper[4735]: I0317 01:49:41.084565 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" path="/var/lib/kubelet/pods/8c0d53e6-a114-4a1e-946e-1f0278e9984c/volumes" Mar 17 01:49:41 crc kubenswrapper[4735]: I0317 01:49:41.865678 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zcz6b"] Mar 17 01:49:41 crc kubenswrapper[4735]: I0317 01:49:41.866697 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zcz6b" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="registry-server" 
containerID="cri-o://d1e448a9e1a71d783ef2cf54162ea2a6ca09d53096efab5fe267e1ee207f912b" gracePeriod=2 Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.243192 4735 generic.go:334] "Generic (PLEG): container finished" podID="4f144236-b787-41f7-ba92-cec935402967" containerID="d1e448a9e1a71d783ef2cf54162ea2a6ca09d53096efab5fe267e1ee207f912b" exitCode=0 Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.243436 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerDied","Data":"d1e448a9e1a71d783ef2cf54162ea2a6ca09d53096efab5fe267e1ee207f912b"} Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.243460 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zcz6b" event={"ID":"4f144236-b787-41f7-ba92-cec935402967","Type":"ContainerDied","Data":"efe89c6a167559e4fbb433989e1bff1ce92eb4648ce8ee8b54f4d6d93dd2c020"} Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.243474 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe89c6a167559e4fbb433989e1bff1ce92eb4648ce8ee8b54f4d6d93dd2c020" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.319144 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.390717 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-utilities\") pod \"4f144236-b787-41f7-ba92-cec935402967\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.390923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-catalog-content\") pod \"4f144236-b787-41f7-ba92-cec935402967\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.391073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxqq\" (UniqueName: \"kubernetes.io/projected/4f144236-b787-41f7-ba92-cec935402967-kube-api-access-wnxqq\") pod \"4f144236-b787-41f7-ba92-cec935402967\" (UID: \"4f144236-b787-41f7-ba92-cec935402967\") " Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.391867 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-utilities" (OuterVolumeSpecName: "utilities") pod "4f144236-b787-41f7-ba92-cec935402967" (UID: "4f144236-b787-41f7-ba92-cec935402967"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.392469 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.438130 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f144236-b787-41f7-ba92-cec935402967-kube-api-access-wnxqq" (OuterVolumeSpecName: "kube-api-access-wnxqq") pod "4f144236-b787-41f7-ba92-cec935402967" (UID: "4f144236-b787-41f7-ba92-cec935402967"). InnerVolumeSpecName "kube-api-access-wnxqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.480619 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f144236-b787-41f7-ba92-cec935402967" (UID: "4f144236-b787-41f7-ba92-cec935402967"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.495065 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f144236-b787-41f7-ba92-cec935402967-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:42 crc kubenswrapper[4735]: I0317 01:49:42.495094 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxqq\" (UniqueName: \"kubernetes.io/projected/4f144236-b787-41f7-ba92-cec935402967-kube-api-access-wnxqq\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:43 crc kubenswrapper[4735]: I0317 01:49:43.257736 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zcz6b" Mar 17 01:49:43 crc kubenswrapper[4735]: I0317 01:49:43.303920 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zcz6b"] Mar 17 01:49:43 crc kubenswrapper[4735]: I0317 01:49:43.311252 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zcz6b"] Mar 17 01:49:45 crc kubenswrapper[4735]: I0317 01:49:45.089214 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f144236-b787-41f7-ba92-cec935402967" path="/var/lib/kubelet/pods/4f144236-b787-41f7-ba92-cec935402967/volumes" Mar 17 01:49:49 crc kubenswrapper[4735]: I0317 01:49:49.234368 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:49 crc kubenswrapper[4735]: I0317 01:49:49.295143 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:50 crc kubenswrapper[4735]: I0317 01:49:50.472082 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tb2f7"] Mar 17 01:49:50 crc kubenswrapper[4735]: I0317 01:49:50.472278 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tb2f7" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="registry-server" containerID="cri-o://3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083" gracePeriod=2 Mar 17 01:49:50 crc kubenswrapper[4735]: I0317 01:49:50.962554 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.069615 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-utilities\") pod \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.069764 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-catalog-content\") pod \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.069932 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv8h6\" (UniqueName: \"kubernetes.io/projected/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-kube-api-access-vv8h6\") pod \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\" (UID: \"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8\") " Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.070473 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-utilities" (OuterVolumeSpecName: "utilities") pod "5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" (UID: "5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.074868 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-kube-api-access-vv8h6" (OuterVolumeSpecName: "kube-api-access-vv8h6") pod "5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" (UID: "5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8"). InnerVolumeSpecName "kube-api-access-vv8h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.172425 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.172457 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv8h6\" (UniqueName: \"kubernetes.io/projected/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-kube-api-access-vv8h6\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.231379 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" (UID: "5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.273929 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.330729 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerID="3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083" exitCode=0 Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.330777 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerDied","Data":"3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083"} Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.330821 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tb2f7" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.330848 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tb2f7" event={"ID":"5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8","Type":"ContainerDied","Data":"b7f82585ded6a914a0d09385f86069e65721444e81f23fe33032d75da2d02d3d"} Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.330880 4735 scope.go:117] "RemoveContainer" containerID="3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.377330 4735 scope.go:117] "RemoveContainer" containerID="e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.377729 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tb2f7"] Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.399937 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tb2f7"] Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.404506 4735 scope.go:117] "RemoveContainer" containerID="3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.447780 4735 scope.go:117] "RemoveContainer" containerID="3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083" Mar 17 01:49:51 crc kubenswrapper[4735]: E0317 01:49:51.448524 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083\": container with ID starting with 3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083 not found: ID does not exist" containerID="3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.448556 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083"} err="failed to get container status \"3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083\": rpc error: code = NotFound desc = could not find container \"3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083\": container with ID starting with 3849887b6e98fde9581a5e49d24bdac6eece6d383aca7ec514fa911d1903e083 not found: ID does not exist" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.448582 4735 scope.go:117] "RemoveContainer" containerID="e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403" Mar 17 01:49:51 crc kubenswrapper[4735]: E0317 01:49:51.449010 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403\": container with ID starting with e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403 not found: ID does not exist" containerID="e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.449050 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403"} err="failed to get container status \"e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403\": rpc error: code = NotFound desc = could not find container \"e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403\": container with ID starting with e373ac50804c10f53797643b7e025cfe231184b2630ce47527882f91cb2ff403 not found: ID does not exist" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.449075 4735 scope.go:117] "RemoveContainer" containerID="3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd" Mar 17 01:49:51 crc kubenswrapper[4735]: E0317 
01:49:51.449385 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd\": container with ID starting with 3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd not found: ID does not exist" containerID="3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd" Mar 17 01:49:51 crc kubenswrapper[4735]: I0317 01:49:51.449412 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd"} err="failed to get container status \"3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd\": rpc error: code = NotFound desc = could not find container \"3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd\": container with ID starting with 3895f40199a36d39adcca0435cbdb9514841a5ad38a3c1339876014a23fec4cd not found: ID does not exist" Mar 17 01:49:52 crc kubenswrapper[4735]: I0317 01:49:52.073095 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:49:52 crc kubenswrapper[4735]: E0317 01:49:52.073573 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:49:53 crc kubenswrapper[4735]: I0317 01:49:53.081763 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" path="/var/lib/kubelet/pods/5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8/volumes" Mar 17 01:49:55 crc kubenswrapper[4735]: I0317 01:49:55.448562 
4735 scope.go:117] "RemoveContainer" containerID="16140600b1ad6e6b0f966f760916bd2f1045fce16e5f58a933e57fdfa7818b24" Mar 17 01:49:55 crc kubenswrapper[4735]: I0317 01:49:55.476793 4735 scope.go:117] "RemoveContainer" containerID="b19b2079393359e89db20e4efeb8879bcdce29dac8963251084fe84bf601c558" Mar 17 01:49:55 crc kubenswrapper[4735]: I0317 01:49:55.521234 4735 scope.go:117] "RemoveContainer" containerID="4a11a07354712c109887152b4feb62031a0cd1e0f00d4b7e22db85221970d721" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.161394 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561870-j7g2b"] Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162650 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162674 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162703 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="extract-utilities" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162715 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="extract-utilities" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162731 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="extract-utilities" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162743 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="extract-utilities" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162766 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="extract-content" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162777 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="extract-content" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162806 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="extract-content" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162818 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="extract-content" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162848 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="extract-content" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162889 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="extract-content" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162907 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162918 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162936 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="extract-utilities" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162949 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="extract-utilities" Mar 17 01:50:00 crc kubenswrapper[4735]: E0317 01:50:00.162966 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.162977 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.163331 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4b62e3-a7ca-4cd3-bc14-2ac46b6b94c8" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.163352 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f144236-b787-41f7-ba92-cec935402967" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.163369 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0d53e6-a114-4a1e-946e-1f0278e9984c" containerName="registry-server" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.164522 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.170698 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.170746 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.170761 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.177665 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-j7g2b"] Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.272963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zdd\" (UniqueName: 
\"kubernetes.io/projected/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0-kube-api-access-z9zdd\") pod \"auto-csr-approver-29561870-j7g2b\" (UID: \"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0\") " pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.375710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zdd\" (UniqueName: \"kubernetes.io/projected/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0-kube-api-access-z9zdd\") pod \"auto-csr-approver-29561870-j7g2b\" (UID: \"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0\") " pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.394648 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zdd\" (UniqueName: \"kubernetes.io/projected/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0-kube-api-access-z9zdd\") pod \"auto-csr-approver-29561870-j7g2b\" (UID: \"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0\") " pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.513990 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:00 crc kubenswrapper[4735]: I0317 01:50:00.959419 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-j7g2b"] Mar 17 01:50:01 crc kubenswrapper[4735]: I0317 01:50:01.436432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" event={"ID":"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0","Type":"ContainerStarted","Data":"0cb190ea4b030fdc3968859b51063251ec1926870026596c4de96d5a07cec225"} Mar 17 01:50:03 crc kubenswrapper[4735]: I0317 01:50:03.075194 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:50:03 crc kubenswrapper[4735]: E0317 01:50:03.076003 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:50:03 crc kubenswrapper[4735]: I0317 01:50:03.459631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" event={"ID":"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0","Type":"ContainerStarted","Data":"3bf10ec074e90ee94d758d494c4ef7edca760a3f859794ac487c8bc71bccb628"} Mar 17 01:50:03 crc kubenswrapper[4735]: I0317 01:50:03.487209 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" podStartSLOduration=2.261890336 podStartE2EDuration="3.487162614s" podCreationTimestamp="2026-03-17 01:50:00 +0000 UTC" firstStartedPulling="2026-03-17 01:50:00.966510044 +0000 UTC m=+2426.598743022" lastFinishedPulling="2026-03-17 
01:50:02.191782312 +0000 UTC m=+2427.824015300" observedRunningTime="2026-03-17 01:50:03.480649617 +0000 UTC m=+2429.112882635" watchObservedRunningTime="2026-03-17 01:50:03.487162614 +0000 UTC m=+2429.119395602" Mar 17 01:50:04 crc kubenswrapper[4735]: I0317 01:50:04.472242 4735 generic.go:334] "Generic (PLEG): container finished" podID="df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0" containerID="3bf10ec074e90ee94d758d494c4ef7edca760a3f859794ac487c8bc71bccb628" exitCode=0 Mar 17 01:50:04 crc kubenswrapper[4735]: I0317 01:50:04.472295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" event={"ID":"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0","Type":"ContainerDied","Data":"3bf10ec074e90ee94d758d494c4ef7edca760a3f859794ac487c8bc71bccb628"} Mar 17 01:50:05 crc kubenswrapper[4735]: I0317 01:50:05.851357 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:05 crc kubenswrapper[4735]: I0317 01:50:05.898628 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zdd\" (UniqueName: \"kubernetes.io/projected/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0-kube-api-access-z9zdd\") pod \"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0\" (UID: \"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0\") " Mar 17 01:50:05 crc kubenswrapper[4735]: I0317 01:50:05.905131 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0-kube-api-access-z9zdd" (OuterVolumeSpecName: "kube-api-access-z9zdd") pod "df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0" (UID: "df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0"). InnerVolumeSpecName "kube-api-access-z9zdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:50:06 crc kubenswrapper[4735]: I0317 01:50:06.000616 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zdd\" (UniqueName: \"kubernetes.io/projected/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0-kube-api-access-z9zdd\") on node \"crc\" DevicePath \"\"" Mar 17 01:50:06 crc kubenswrapper[4735]: I0317 01:50:06.493292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" event={"ID":"df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0","Type":"ContainerDied","Data":"0cb190ea4b030fdc3968859b51063251ec1926870026596c4de96d5a07cec225"} Mar 17 01:50:06 crc kubenswrapper[4735]: I0317 01:50:06.493345 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb190ea4b030fdc3968859b51063251ec1926870026596c4de96d5a07cec225" Mar 17 01:50:06 crc kubenswrapper[4735]: I0317 01:50:06.493348 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-j7g2b" Mar 17 01:50:06 crc kubenswrapper[4735]: I0317 01:50:06.554534 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-cjrqv"] Mar 17 01:50:06 crc kubenswrapper[4735]: I0317 01:50:06.563221 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-cjrqv"] Mar 17 01:50:07 crc kubenswrapper[4735]: I0317 01:50:07.083533 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8" path="/var/lib/kubelet/pods/bfcaf45f-efd5-4bb0-bd0c-a4355144c8c8/volumes" Mar 17 01:50:15 crc kubenswrapper[4735]: I0317 01:50:15.081136 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:50:15 crc kubenswrapper[4735]: E0317 01:50:15.084494 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:50:27 crc kubenswrapper[4735]: I0317 01:50:27.076461 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:50:27 crc kubenswrapper[4735]: E0317 01:50:27.077346 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:50:42 crc kubenswrapper[4735]: I0317 01:50:42.073672 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:50:42 crc kubenswrapper[4735]: E0317 01:50:42.074769 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:50:55 crc kubenswrapper[4735]: I0317 01:50:55.603205 4735 scope.go:117] "RemoveContainer" containerID="4ce4033d72149f893dbe7947805d5f254b63d96c0b5c328b882a97b771257c82" Mar 17 01:50:57 crc kubenswrapper[4735]: I0317 01:50:57.072954 4735 scope.go:117] "RemoveContainer" 
containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:50:57 crc kubenswrapper[4735]: E0317 01:50:57.073544 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:51:05 crc kubenswrapper[4735]: I0317 01:51:05.126647 4735 generic.go:334] "Generic (PLEG): container finished" podID="7354d451-8e24-4c65-855c-e6a33c66d134" containerID="b879983169f6e81e47d42fd66f6b22638a77f5866a9a10fe308fe51064fab9ad" exitCode=0 Mar 17 01:51:05 crc kubenswrapper[4735]: I0317 01:51:05.126752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" event={"ID":"7354d451-8e24-4c65-855c-e6a33c66d134","Type":"ContainerDied","Data":"b879983169f6e81e47d42fd66f6b22638a77f5866a9a10fe308fe51064fab9ad"} Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.795995 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.928984 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-ssh-key-openstack-edpm-ipam\") pod \"7354d451-8e24-4c65-855c-e6a33c66d134\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.929131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-secret-0\") pod \"7354d451-8e24-4c65-855c-e6a33c66d134\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.929152 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6nd\" (UniqueName: \"kubernetes.io/projected/7354d451-8e24-4c65-855c-e6a33c66d134-kube-api-access-lx6nd\") pod \"7354d451-8e24-4c65-855c-e6a33c66d134\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.929283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-combined-ca-bundle\") pod \"7354d451-8e24-4c65-855c-e6a33c66d134\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.929325 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-inventory\") pod \"7354d451-8e24-4c65-855c-e6a33c66d134\" (UID: \"7354d451-8e24-4c65-855c-e6a33c66d134\") " Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.937480 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7354d451-8e24-4c65-855c-e6a33c66d134" (UID: "7354d451-8e24-4c65-855c-e6a33c66d134"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.937640 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7354d451-8e24-4c65-855c-e6a33c66d134-kube-api-access-lx6nd" (OuterVolumeSpecName: "kube-api-access-lx6nd") pod "7354d451-8e24-4c65-855c-e6a33c66d134" (UID: "7354d451-8e24-4c65-855c-e6a33c66d134"). InnerVolumeSpecName "kube-api-access-lx6nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.957237 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7354d451-8e24-4c65-855c-e6a33c66d134" (UID: "7354d451-8e24-4c65-855c-e6a33c66d134"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.957703 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7354d451-8e24-4c65-855c-e6a33c66d134" (UID: "7354d451-8e24-4c65-855c-e6a33c66d134"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:51:06 crc kubenswrapper[4735]: I0317 01:51:06.979323 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-inventory" (OuterVolumeSpecName: "inventory") pod "7354d451-8e24-4c65-855c-e6a33c66d134" (UID: "7354d451-8e24-4c65-855c-e6a33c66d134"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.031255 4735 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.031286 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx6nd\" (UniqueName: \"kubernetes.io/projected/7354d451-8e24-4c65-855c-e6a33c66d134-kube-api-access-lx6nd\") on node \"crc\" DevicePath \"\"" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.031298 4735 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.031308 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.031316 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7354d451-8e24-4c65-855c-e6a33c66d134-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.151375 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" event={"ID":"7354d451-8e24-4c65-855c-e6a33c66d134","Type":"ContainerDied","Data":"2ccadc829269cfac786292b075966236f720b24d91b00e13cbd15d33f63a4259"} Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.151408 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ccadc829269cfac786292b075966236f720b24d91b00e13cbd15d33f63a4259" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.151492 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vswmz" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.257614 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf"] Mar 17 01:51:07 crc kubenswrapper[4735]: E0317 01:51:07.258399 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7354d451-8e24-4c65-855c-e6a33c66d134" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.258426 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7354d451-8e24-4c65-855c-e6a33c66d134" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:51:07 crc kubenswrapper[4735]: E0317 01:51:07.258458 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0" containerName="oc" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.258467 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0" containerName="oc" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.258728 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0" containerName="oc" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.258769 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7354d451-8e24-4c65-855c-e6a33c66d134" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.259610 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.265700 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.265740 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.265752 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.265709 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.269754 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.270393 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.272980 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.307718 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf"] Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349598 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349726 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349748 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: 
\"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/69e281de-fd13-450e-acf8-ee5f0561f0b9-kube-api-access-nxm72\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349845 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc 
kubenswrapper[4735]: I0317 01:51:07.349958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.349980 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.451213 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.451276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/69e281de-fd13-450e-acf8-ee5f0561f0b9-kube-api-access-nxm72\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.451666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.451723 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452352 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452501 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.452574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.453577 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.456108 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.456200 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.456397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.457371 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.458229 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.458490 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.458631 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.458950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.460236 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.466392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/69e281de-fd13-450e-acf8-ee5f0561f0b9-kube-api-access-nxm72\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8s2pf\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:07 crc kubenswrapper[4735]: I0317 01:51:07.576541 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:51:08 crc kubenswrapper[4735]: I0317 01:51:08.274548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf"] Mar 17 01:51:09 crc kubenswrapper[4735]: I0317 01:51:09.177933 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" event={"ID":"69e281de-fd13-450e-acf8-ee5f0561f0b9","Type":"ContainerStarted","Data":"ca6011313c220dbdc7934c066c48ffa4e01ac912b24f7f5c375973ade51eb4d0"} Mar 17 01:51:09 crc kubenswrapper[4735]: I0317 01:51:09.178731 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" event={"ID":"69e281de-fd13-450e-acf8-ee5f0561f0b9","Type":"ContainerStarted","Data":"4b1de088bc84c380e77f2b6df7221cf4dc0bb314b0cd80e5088502c742368db2"} Mar 17 01:51:09 crc kubenswrapper[4735]: I0317 01:51:09.208938 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" podStartSLOduration=1.7494874999999999 
podStartE2EDuration="2.208918568s" podCreationTimestamp="2026-03-17 01:51:07 +0000 UTC" firstStartedPulling="2026-03-17 01:51:08.289690742 +0000 UTC m=+2493.921923770" lastFinishedPulling="2026-03-17 01:51:08.74912185 +0000 UTC m=+2494.381354838" observedRunningTime="2026-03-17 01:51:09.206347726 +0000 UTC m=+2494.838580734" watchObservedRunningTime="2026-03-17 01:51:09.208918568 +0000 UTC m=+2494.841151556" Mar 17 01:51:12 crc kubenswrapper[4735]: I0317 01:51:12.074850 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:51:12 crc kubenswrapper[4735]: E0317 01:51:12.077159 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:51:25 crc kubenswrapper[4735]: I0317 01:51:25.081319 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:51:25 crc kubenswrapper[4735]: E0317 01:51:25.085593 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:51:38 crc kubenswrapper[4735]: I0317 01:51:38.073350 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:51:38 crc kubenswrapper[4735]: E0317 01:51:38.074181 
4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:51:49 crc kubenswrapper[4735]: I0317 01:51:49.073999 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:51:49 crc kubenswrapper[4735]: E0317 01:51:49.077396 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.152439 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561872-7547z"] Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.154251 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.157107 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.157326 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.158398 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.170457 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-7547z"] Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.208281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfcw\" (UniqueName: \"kubernetes.io/projected/f699f9a1-dc18-45ce-9c5f-2c384e8f936e-kube-api-access-nvfcw\") pod \"auto-csr-approver-29561872-7547z\" (UID: \"f699f9a1-dc18-45ce-9c5f-2c384e8f936e\") " pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.309868 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvfcw\" (UniqueName: \"kubernetes.io/projected/f699f9a1-dc18-45ce-9c5f-2c384e8f936e-kube-api-access-nvfcw\") pod \"auto-csr-approver-29561872-7547z\" (UID: \"f699f9a1-dc18-45ce-9c5f-2c384e8f936e\") " pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.326967 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfcw\" (UniqueName: \"kubernetes.io/projected/f699f9a1-dc18-45ce-9c5f-2c384e8f936e-kube-api-access-nvfcw\") pod \"auto-csr-approver-29561872-7547z\" (UID: \"f699f9a1-dc18-45ce-9c5f-2c384e8f936e\") " 
pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.470706 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:00 crc kubenswrapper[4735]: I0317 01:52:00.931963 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-7547z"] Mar 17 01:52:01 crc kubenswrapper[4735]: I0317 01:52:01.755026 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561872-7547z" event={"ID":"f699f9a1-dc18-45ce-9c5f-2c384e8f936e","Type":"ContainerStarted","Data":"87e9f0a0ae667cfc3d7e4dd30819e4239d79f770157971ddd5201a96edaa654f"} Mar 17 01:52:02 crc kubenswrapper[4735]: I0317 01:52:02.073428 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:52:02 crc kubenswrapper[4735]: E0317 01:52:02.073901 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:52:02 crc kubenswrapper[4735]: I0317 01:52:02.763713 4735 generic.go:334] "Generic (PLEG): container finished" podID="f699f9a1-dc18-45ce-9c5f-2c384e8f936e" containerID="7f6c9a5d71acfe633170a0083ffb1806b9378326984d02224977d346a36aa55f" exitCode=0 Mar 17 01:52:02 crc kubenswrapper[4735]: I0317 01:52:02.763779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561872-7547z" event={"ID":"f699f9a1-dc18-45ce-9c5f-2c384e8f936e","Type":"ContainerDied","Data":"7f6c9a5d71acfe633170a0083ffb1806b9378326984d02224977d346a36aa55f"} 
Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.189737 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.383614 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvfcw\" (UniqueName: \"kubernetes.io/projected/f699f9a1-dc18-45ce-9c5f-2c384e8f936e-kube-api-access-nvfcw\") pod \"f699f9a1-dc18-45ce-9c5f-2c384e8f936e\" (UID: \"f699f9a1-dc18-45ce-9c5f-2c384e8f936e\") " Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.388384 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f699f9a1-dc18-45ce-9c5f-2c384e8f936e-kube-api-access-nvfcw" (OuterVolumeSpecName: "kube-api-access-nvfcw") pod "f699f9a1-dc18-45ce-9c5f-2c384e8f936e" (UID: "f699f9a1-dc18-45ce-9c5f-2c384e8f936e"). InnerVolumeSpecName "kube-api-access-nvfcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.486587 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvfcw\" (UniqueName: \"kubernetes.io/projected/f699f9a1-dc18-45ce-9c5f-2c384e8f936e-kube-api-access-nvfcw\") on node \"crc\" DevicePath \"\"" Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.785689 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561872-7547z" event={"ID":"f699f9a1-dc18-45ce-9c5f-2c384e8f936e","Type":"ContainerDied","Data":"87e9f0a0ae667cfc3d7e4dd30819e4239d79f770157971ddd5201a96edaa654f"} Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.785745 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e9f0a0ae667cfc3d7e4dd30819e4239d79f770157971ddd5201a96edaa654f" Mar 17 01:52:04 crc kubenswrapper[4735]: I0317 01:52:04.785742 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-7547z" Mar 17 01:52:05 crc kubenswrapper[4735]: I0317 01:52:05.309019 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-85mpv"] Mar 17 01:52:05 crc kubenswrapper[4735]: I0317 01:52:05.315801 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-85mpv"] Mar 17 01:52:07 crc kubenswrapper[4735]: I0317 01:52:07.090552 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697cb47c-75db-4d5e-a348-23fd7b455b06" path="/var/lib/kubelet/pods/697cb47c-75db-4d5e-a348-23fd7b455b06/volumes" Mar 17 01:52:13 crc kubenswrapper[4735]: I0317 01:52:13.073921 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:52:13 crc kubenswrapper[4735]: E0317 01:52:13.074681 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:52:28 crc kubenswrapper[4735]: I0317 01:52:28.073527 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:52:28 crc kubenswrapper[4735]: E0317 01:52:28.074508 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:52:43 crc kubenswrapper[4735]: I0317 01:52:43.073605 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:52:44 crc kubenswrapper[4735]: I0317 01:52:44.218489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"231fcd5e63fe450140935954db3606d53aac1964da346772df02580b5f31a6b5"} Mar 17 01:52:55 crc kubenswrapper[4735]: I0317 01:52:55.704228 4735 scope.go:117] "RemoveContainer" containerID="a7a5ad6a80ead65cdbd8601b4c4b351dee2206b4c56ef38425df744d4a3a51e7" Mar 17 01:53:47 crc kubenswrapper[4735]: I0317 01:53:47.913337 4735 generic.go:334] "Generic (PLEG): container finished" podID="69e281de-fd13-450e-acf8-ee5f0561f0b9" containerID="ca6011313c220dbdc7934c066c48ffa4e01ac912b24f7f5c375973ade51eb4d0" exitCode=0 Mar 17 01:53:47 crc kubenswrapper[4735]: I0317 01:53:47.913424 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" event={"ID":"69e281de-fd13-450e-acf8-ee5f0561f0b9","Type":"ContainerDied","Data":"ca6011313c220dbdc7934c066c48ffa4e01ac912b24f7f5c375973ade51eb4d0"} Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.365975 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.454875 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-ssh-key-openstack-edpm-ipam\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.454911 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-0\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.454934 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-extra-config-0\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.454958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/69e281de-fd13-450e-acf8-ee5f0561f0b9-kube-api-access-nxm72\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.454988 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-3\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 
01:53:49.455029 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-0\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.455081 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-2\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.455137 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-1\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.455154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-inventory\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.455191 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-1\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.455211 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-combined-ca-bundle\") pod \"69e281de-fd13-450e-acf8-ee5f0561f0b9\" (UID: \"69e281de-fd13-450e-acf8-ee5f0561f0b9\") " Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.463767 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.474254 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e281de-fd13-450e-acf8-ee5f0561f0b9-kube-api-access-nxm72" (OuterVolumeSpecName: "kube-api-access-nxm72") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "kube-api-access-nxm72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.485504 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.490534 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-inventory" (OuterVolumeSpecName: "inventory") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.497074 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.511080 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.513632 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.514462 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.514716 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.515720 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.519730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "69e281de-fd13-450e-acf8-ee5f0561f0b9" (UID: "69e281de-fd13-450e-acf8-ee5f0561f0b9"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557126 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxm72\" (UniqueName: \"kubernetes.io/projected/69e281de-fd13-450e-acf8-ee5f0561f0b9-kube-api-access-nxm72\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557241 4735 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557298 4735 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557360 4735 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557427 4735 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557520 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557601 4735 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557653 4735 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557707 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557756 4735 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.557805 4735 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69e281de-fd13-450e-acf8-ee5f0561f0b9-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.940723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" event={"ID":"69e281de-fd13-450e-acf8-ee5f0561f0b9","Type":"ContainerDied","Data":"4b1de088bc84c380e77f2b6df7221cf4dc0bb314b0cd80e5088502c742368db2"} Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.940785 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b1de088bc84c380e77f2b6df7221cf4dc0bb314b0cd80e5088502c742368db2" Mar 17 01:53:49 crc kubenswrapper[4735]: I0317 01:53:49.940812 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8s2pf" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.094024 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs"] Mar 17 01:53:50 crc kubenswrapper[4735]: E0317 01:53:50.094475 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e281de-fd13-450e-acf8-ee5f0561f0b9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.094490 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e281de-fd13-450e-acf8-ee5f0561f0b9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 17 01:53:50 crc kubenswrapper[4735]: E0317 01:53:50.094516 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f699f9a1-dc18-45ce-9c5f-2c384e8f936e" containerName="oc" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.094522 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f699f9a1-dc18-45ce-9c5f-2c384e8f936e" containerName="oc" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.094671 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f699f9a1-dc18-45ce-9c5f-2c384e8f936e" containerName="oc" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.094688 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e281de-fd13-450e-acf8-ee5f0561f0b9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.095255 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.097672 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.097730 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.097800 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.098351 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9c5cs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.098690 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.114699 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs"] Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.172608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.172654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: 
\"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.172678 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.172890 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbbz\" (UniqueName: \"kubernetes.io/projected/d4354aba-95a7-4d25-a2e5-5935a961a0d1-kube-api-access-ncbbz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.173005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.173219 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.173293 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275230 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbbz\" (UniqueName: \"kubernetes.io/projected/d4354aba-95a7-4d25-a2e5-5935a961a0d1-kube-api-access-ncbbz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275372 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275484 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.275503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.280625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.280846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.280939 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.281724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.282733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.283116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.294241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbbz\" (UniqueName: \"kubernetes.io/projected/d4354aba-95a7-4d25-a2e5-5935a961a0d1-kube-api-access-ncbbz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.410054 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.982069 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs"] Mar 17 01:53:50 crc kubenswrapper[4735]: I0317 01:53:50.992968 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:53:51 crc kubenswrapper[4735]: I0317 01:53:51.961518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" event={"ID":"d4354aba-95a7-4d25-a2e5-5935a961a0d1","Type":"ContainerStarted","Data":"9bb305ffd2e5cf5e2a915bb66ad8fb261003bdd360b9147a93d2172037a67e32"} Mar 17 01:53:51 crc kubenswrapper[4735]: I0317 01:53:51.961812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" event={"ID":"d4354aba-95a7-4d25-a2e5-5935a961a0d1","Type":"ContainerStarted","Data":"a9a97257d08d1113ec8b275ba4367223f1d512e099b6f1b9dc674430cdada0c5"} Mar 17 01:53:52 crc kubenswrapper[4735]: I0317 01:53:52.000442 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" podStartSLOduration=1.482507673 podStartE2EDuration="2.000423934s" podCreationTimestamp="2026-03-17 01:53:50 +0000 UTC" firstStartedPulling="2026-03-17 01:53:50.992748622 +0000 UTC m=+2656.624981600" lastFinishedPulling="2026-03-17 01:53:51.510664883 +0000 UTC m=+2657.142897861" observedRunningTime="2026-03-17 01:53:51.996283604 +0000 UTC m=+2657.628516622" watchObservedRunningTime="2026-03-17 01:53:52.000423934 +0000 UTC m=+2657.632656912" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.157563 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561874-58lx6"] Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 
01:54:00.160115 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.163391 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.163760 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.164088 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.166917 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-58lx6"] Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.224377 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5h6q\" (UniqueName: \"kubernetes.io/projected/4100711d-cc79-4277-bc28-f8c6cc56bfb0-kube-api-access-g5h6q\") pod \"auto-csr-approver-29561874-58lx6\" (UID: \"4100711d-cc79-4277-bc28-f8c6cc56bfb0\") " pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.327729 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5h6q\" (UniqueName: \"kubernetes.io/projected/4100711d-cc79-4277-bc28-f8c6cc56bfb0-kube-api-access-g5h6q\") pod \"auto-csr-approver-29561874-58lx6\" (UID: \"4100711d-cc79-4277-bc28-f8c6cc56bfb0\") " pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.351021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5h6q\" (UniqueName: \"kubernetes.io/projected/4100711d-cc79-4277-bc28-f8c6cc56bfb0-kube-api-access-g5h6q\") pod 
\"auto-csr-approver-29561874-58lx6\" (UID: \"4100711d-cc79-4277-bc28-f8c6cc56bfb0\") " pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.499211 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:00 crc kubenswrapper[4735]: I0317 01:54:00.981643 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-58lx6"] Mar 17 01:54:01 crc kubenswrapper[4735]: I0317 01:54:01.086286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561874-58lx6" event={"ID":"4100711d-cc79-4277-bc28-f8c6cc56bfb0","Type":"ContainerStarted","Data":"9b2c2764d2c4e59e3a411b3609fdec837f27387af4f416cb1b055a1400650388"} Mar 17 01:54:03 crc kubenswrapper[4735]: I0317 01:54:03.130496 4735 generic.go:334] "Generic (PLEG): container finished" podID="4100711d-cc79-4277-bc28-f8c6cc56bfb0" containerID="dfbbdfad9b6c33022c340310b793334bf0ebcb3c23323c15df1bafe91acbffc1" exitCode=0 Mar 17 01:54:03 crc kubenswrapper[4735]: I0317 01:54:03.130614 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561874-58lx6" event={"ID":"4100711d-cc79-4277-bc28-f8c6cc56bfb0","Type":"ContainerDied","Data":"dfbbdfad9b6c33022c340310b793334bf0ebcb3c23323c15df1bafe91acbffc1"} Mar 17 01:54:04 crc kubenswrapper[4735]: I0317 01:54:04.541160 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:04 crc kubenswrapper[4735]: I0317 01:54:04.614474 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5h6q\" (UniqueName: \"kubernetes.io/projected/4100711d-cc79-4277-bc28-f8c6cc56bfb0-kube-api-access-g5h6q\") pod \"4100711d-cc79-4277-bc28-f8c6cc56bfb0\" (UID: \"4100711d-cc79-4277-bc28-f8c6cc56bfb0\") " Mar 17 01:54:04 crc kubenswrapper[4735]: I0317 01:54:04.621689 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4100711d-cc79-4277-bc28-f8c6cc56bfb0-kube-api-access-g5h6q" (OuterVolumeSpecName: "kube-api-access-g5h6q") pod "4100711d-cc79-4277-bc28-f8c6cc56bfb0" (UID: "4100711d-cc79-4277-bc28-f8c6cc56bfb0"). InnerVolumeSpecName "kube-api-access-g5h6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:54:04 crc kubenswrapper[4735]: I0317 01:54:04.717017 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5h6q\" (UniqueName: \"kubernetes.io/projected/4100711d-cc79-4277-bc28-f8c6cc56bfb0-kube-api-access-g5h6q\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:05 crc kubenswrapper[4735]: I0317 01:54:05.158018 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561874-58lx6" event={"ID":"4100711d-cc79-4277-bc28-f8c6cc56bfb0","Type":"ContainerDied","Data":"9b2c2764d2c4e59e3a411b3609fdec837f27387af4f416cb1b055a1400650388"} Mar 17 01:54:05 crc kubenswrapper[4735]: I0317 01:54:05.158076 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2c2764d2c4e59e3a411b3609fdec837f27387af4f416cb1b055a1400650388" Mar 17 01:54:05 crc kubenswrapper[4735]: I0317 01:54:05.158159 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-58lx6" Mar 17 01:54:05 crc kubenswrapper[4735]: I0317 01:54:05.633706 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-d8m6x"] Mar 17 01:54:05 crc kubenswrapper[4735]: I0317 01:54:05.646672 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-d8m6x"] Mar 17 01:54:07 crc kubenswrapper[4735]: I0317 01:54:07.086101 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc36276-ff9c-47ff-b33f-42d47c854281" path="/var/lib/kubelet/pods/0dc36276-ff9c-47ff-b33f-42d47c854281/volumes" Mar 17 01:54:55 crc kubenswrapper[4735]: I0317 01:54:55.836064 4735 scope.go:117] "RemoveContainer" containerID="65632f82533e9d73c42e3c52a275a1e91c14dc2adee7801d431ee32028e940c0" Mar 17 01:55:12 crc kubenswrapper[4735]: I0317 01:55:12.606330 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:55:12 crc kubenswrapper[4735]: I0317 01:55:12.607100 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:55:42 crc kubenswrapper[4735]: I0317 01:55:42.606565 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:55:42 crc kubenswrapper[4735]: 
I0317 01:55:42.607316 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:55:55 crc kubenswrapper[4735]: I0317 01:55:55.942244 4735 scope.go:117] "RemoveContainer" containerID="d1e448a9e1a71d783ef2cf54162ea2a6ca09d53096efab5fe267e1ee207f912b" Mar 17 01:55:55 crc kubenswrapper[4735]: I0317 01:55:55.971275 4735 scope.go:117] "RemoveContainer" containerID="ed57edd4cfdef69880e7607d5e347d1e8f5d5dee9468fd0cd162931af53cc5c9" Mar 17 01:55:55 crc kubenswrapper[4735]: I0317 01:55:55.994882 4735 scope.go:117] "RemoveContainer" containerID="9fc7332aff6c6e95b32f0d0aadf77961cffa46bebe63b7e964299d5afb14158c" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.148036 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561876-z6872"] Mar 17 01:56:00 crc kubenswrapper[4735]: E0317 01:56:00.148914 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100711d-cc79-4277-bc28-f8c6cc56bfb0" containerName="oc" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.148925 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100711d-cc79-4277-bc28-f8c6cc56bfb0" containerName="oc" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.149128 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100711d-cc79-4277-bc28-f8c6cc56bfb0" containerName="oc" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.149703 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.152674 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.152845 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.153151 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.162364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-z6872"] Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.329838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6qng\" (UniqueName: \"kubernetes.io/projected/1313a60f-d73a-4de6-8750-8f2563aba379-kube-api-access-s6qng\") pod \"auto-csr-approver-29561876-z6872\" (UID: \"1313a60f-d73a-4de6-8750-8f2563aba379\") " pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.432841 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6qng\" (UniqueName: \"kubernetes.io/projected/1313a60f-d73a-4de6-8750-8f2563aba379-kube-api-access-s6qng\") pod \"auto-csr-approver-29561876-z6872\" (UID: \"1313a60f-d73a-4de6-8750-8f2563aba379\") " pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.459914 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6qng\" (UniqueName: \"kubernetes.io/projected/1313a60f-d73a-4de6-8750-8f2563aba379-kube-api-access-s6qng\") pod \"auto-csr-approver-29561876-z6872\" (UID: \"1313a60f-d73a-4de6-8750-8f2563aba379\") " 
pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.476745 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:00 crc kubenswrapper[4735]: I0317 01:56:00.964525 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-z6872"] Mar 17 01:56:01 crc kubenswrapper[4735]: I0317 01:56:01.505980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-z6872" event={"ID":"1313a60f-d73a-4de6-8750-8f2563aba379","Type":"ContainerStarted","Data":"d77939859cfb2c1b1409296e1a867a9dba3fb1b74a3e5971a5a9612de541db66"} Mar 17 01:56:02 crc kubenswrapper[4735]: I0317 01:56:02.516261 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-z6872" event={"ID":"1313a60f-d73a-4de6-8750-8f2563aba379","Type":"ContainerStarted","Data":"a594d9cc36e0b84ca23100eccd08f746276944a140a785d75701be9fe908fb42"} Mar 17 01:56:02 crc kubenswrapper[4735]: I0317 01:56:02.530900 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561876-z6872" podStartSLOduration=1.355684889 podStartE2EDuration="2.530879967s" podCreationTimestamp="2026-03-17 01:56:00 +0000 UTC" firstStartedPulling="2026-03-17 01:56:00.982736999 +0000 UTC m=+2786.614969977" lastFinishedPulling="2026-03-17 01:56:02.157932077 +0000 UTC m=+2787.790165055" observedRunningTime="2026-03-17 01:56:02.527090255 +0000 UTC m=+2788.159323233" watchObservedRunningTime="2026-03-17 01:56:02.530879967 +0000 UTC m=+2788.163112945" Mar 17 01:56:03 crc kubenswrapper[4735]: I0317 01:56:03.526013 4735 generic.go:334] "Generic (PLEG): container finished" podID="1313a60f-d73a-4de6-8750-8f2563aba379" containerID="a594d9cc36e0b84ca23100eccd08f746276944a140a785d75701be9fe908fb42" exitCode=0 Mar 17 01:56:03 crc 
kubenswrapper[4735]: I0317 01:56:03.526063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-z6872" event={"ID":"1313a60f-d73a-4de6-8750-8f2563aba379","Type":"ContainerDied","Data":"a594d9cc36e0b84ca23100eccd08f746276944a140a785d75701be9fe908fb42"} Mar 17 01:56:04 crc kubenswrapper[4735]: I0317 01:56:04.904661 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.029673 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6qng\" (UniqueName: \"kubernetes.io/projected/1313a60f-d73a-4de6-8750-8f2563aba379-kube-api-access-s6qng\") pod \"1313a60f-d73a-4de6-8750-8f2563aba379\" (UID: \"1313a60f-d73a-4de6-8750-8f2563aba379\") " Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.038428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1313a60f-d73a-4de6-8750-8f2563aba379-kube-api-access-s6qng" (OuterVolumeSpecName: "kube-api-access-s6qng") pod "1313a60f-d73a-4de6-8750-8f2563aba379" (UID: "1313a60f-d73a-4de6-8750-8f2563aba379"). InnerVolumeSpecName "kube-api-access-s6qng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.132724 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6qng\" (UniqueName: \"kubernetes.io/projected/1313a60f-d73a-4de6-8750-8f2563aba379-kube-api-access-s6qng\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.561629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-z6872" event={"ID":"1313a60f-d73a-4de6-8750-8f2563aba379","Type":"ContainerDied","Data":"d77939859cfb2c1b1409296e1a867a9dba3fb1b74a3e5971a5a9612de541db66"} Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.563479 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d77939859cfb2c1b1409296e1a867a9dba3fb1b74a3e5971a5a9612de541db66" Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.563559 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-z6872" Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.619852 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-j7g2b"] Mar 17 01:56:05 crc kubenswrapper[4735]: I0317 01:56:05.627609 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-j7g2b"] Mar 17 01:56:07 crc kubenswrapper[4735]: I0317 01:56:07.085540 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0" path="/var/lib/kubelet/pods/df4bdd64-9b75-4ca1-b1ae-543fa38cd0b0/volumes" Mar 17 01:56:12 crc kubenswrapper[4735]: I0317 01:56:12.606122 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 01:56:12 crc kubenswrapper[4735]: I0317 01:56:12.606846 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:56:12 crc kubenswrapper[4735]: I0317 01:56:12.606926 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:56:12 crc kubenswrapper[4735]: I0317 01:56:12.607806 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"231fcd5e63fe450140935954db3606d53aac1964da346772df02580b5f31a6b5"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:56:12 crc kubenswrapper[4735]: I0317 01:56:12.607889 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://231fcd5e63fe450140935954db3606d53aac1964da346772df02580b5f31a6b5" gracePeriod=600 Mar 17 01:56:13 crc kubenswrapper[4735]: I0317 01:56:13.641142 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="231fcd5e63fe450140935954db3606d53aac1964da346772df02580b5f31a6b5" exitCode=0 Mar 17 01:56:13 crc kubenswrapper[4735]: I0317 01:56:13.641228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"231fcd5e63fe450140935954db3606d53aac1964da346772df02580b5f31a6b5"} Mar 17 01:56:13 crc kubenswrapper[4735]: I0317 01:56:13.641815 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb"} Mar 17 01:56:13 crc kubenswrapper[4735]: I0317 01:56:13.641849 4735 scope.go:117] "RemoveContainer" containerID="8c328f45767d8475858013c348bf8f6b61817ce5d08e7487510ff060e6221abb" Mar 17 01:56:56 crc kubenswrapper[4735]: I0317 01:56:56.052381 4735 generic.go:334] "Generic (PLEG): container finished" podID="d4354aba-95a7-4d25-a2e5-5935a961a0d1" containerID="9bb305ffd2e5cf5e2a915bb66ad8fb261003bdd360b9147a93d2172037a67e32" exitCode=0 Mar 17 01:56:56 crc kubenswrapper[4735]: I0317 01:56:56.052478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" event={"ID":"d4354aba-95a7-4d25-a2e5-5935a961a0d1","Type":"ContainerDied","Data":"9bb305ffd2e5cf5e2a915bb66ad8fb261003bdd360b9147a93d2172037a67e32"} Mar 17 01:56:56 crc kubenswrapper[4735]: I0317 01:56:56.088754 4735 scope.go:117] "RemoveContainer" containerID="3bf10ec074e90ee94d758d494c4ef7edca760a3f859794ac487c8bc71bccb628" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.512581 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.699107 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-2\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.699266 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-0\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.699359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-inventory\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.699516 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-telemetry-combined-ca-bundle\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.699558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-1\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc 
kubenswrapper[4735]: I0317 01:56:57.699597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncbbz\" (UniqueName: \"kubernetes.io/projected/d4354aba-95a7-4d25-a2e5-5935a961a0d1-kube-api-access-ncbbz\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.699713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ssh-key-openstack-edpm-ipam\") pod \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\" (UID: \"d4354aba-95a7-4d25-a2e5-5935a961a0d1\") " Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.705700 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.705737 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4354aba-95a7-4d25-a2e5-5935a961a0d1-kube-api-access-ncbbz" (OuterVolumeSpecName: "kube-api-access-ncbbz") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "kube-api-access-ncbbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.727306 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.744338 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-inventory" (OuterVolumeSpecName: "inventory") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.745248 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.746352 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.749113 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d4354aba-95a7-4d25-a2e5-5935a961a0d1" (UID: "d4354aba-95a7-4d25-a2e5-5935a961a0d1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.802084 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.802113 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.802123 4735 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.802133 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.802141 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncbbz\" (UniqueName: \"kubernetes.io/projected/d4354aba-95a7-4d25-a2e5-5935a961a0d1-kube-api-access-ncbbz\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:57 crc 
kubenswrapper[4735]: I0317 01:56:57.802151 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:57 crc kubenswrapper[4735]: I0317 01:56:57.802161 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d4354aba-95a7-4d25-a2e5-5935a961a0d1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:58 crc kubenswrapper[4735]: I0317 01:56:58.073745 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" event={"ID":"d4354aba-95a7-4d25-a2e5-5935a961a0d1","Type":"ContainerDied","Data":"a9a97257d08d1113ec8b275ba4367223f1d512e099b6f1b9dc674430cdada0c5"} Mar 17 01:56:58 crc kubenswrapper[4735]: I0317 01:56:58.073775 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a97257d08d1113ec8b275ba4367223f1d512e099b6f1b9dc674430cdada0c5" Mar 17 01:56:58 crc kubenswrapper[4735]: I0317 01:56:58.073828 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs" Mar 17 01:57:39 crc kubenswrapper[4735]: E0317 01:57:39.344967 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:57948->38.102.83.65:40841: write tcp 38.102.83.65:57948->38.102.83.65:40841: write: broken pipe Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.063157 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Mar 17 01:57:59 crc kubenswrapper[4735]: E0317 01:57:59.064439 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4354aba-95a7-4d25-a2e5-5935a961a0d1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.064457 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4354aba-95a7-4d25-a2e5-5935a961a0d1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:57:59 crc kubenswrapper[4735]: E0317 01:57:59.064473 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1313a60f-d73a-4de6-8750-8f2563aba379" containerName="oc" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.064507 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1313a60f-d73a-4de6-8750-8f2563aba379" containerName="oc" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.064847 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1313a60f-d73a-4de6-8750-8f2563aba379" containerName="oc" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.064898 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4354aba-95a7-4d25-a2e5-5935a961a0d1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.065996 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.069914 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.070140 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.070432 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mrnrm" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.072057 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.096561 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.173830 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.173967 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174050 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174084 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174241 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174305 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ca-certs\") pod 
\"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.174490 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq9td\" (UniqueName: \"kubernetes.io/projected/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-kube-api-access-kq9td\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq9td\" (UniqueName: \"kubernetes.io/projected/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-kube-api-access-kq9td\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276167 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc 
kubenswrapper[4735]: I0317 01:57:59.276226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276387 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276428 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: 
\"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276530 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.276992 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.277323 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.279106 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.279437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.280239 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.285837 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.286589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 
01:57:59.288102 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.299604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq9td\" (UniqueName: \"kubernetes.io/projected/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-kube-api-access-kq9td\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.321882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.441505 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 01:57:59 crc kubenswrapper[4735]: I0317 01:57:59.958871 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.143562 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561878-psf9k"] Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.145036 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.151830 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.152102 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.152230 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.155955 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-psf9k"] Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.195446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbzw\" (UniqueName: \"kubernetes.io/projected/692e77e5-b3c6-4605-b9c1-3e0ad803a90c-kube-api-access-dpbzw\") pod \"auto-csr-approver-29561878-psf9k\" (UID: \"692e77e5-b3c6-4605-b9c1-3e0ad803a90c\") " pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.297699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbzw\" (UniqueName: \"kubernetes.io/projected/692e77e5-b3c6-4605-b9c1-3e0ad803a90c-kube-api-access-dpbzw\") pod \"auto-csr-approver-29561878-psf9k\" (UID: \"692e77e5-b3c6-4605-b9c1-3e0ad803a90c\") " pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.316736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbzw\" (UniqueName: \"kubernetes.io/projected/692e77e5-b3c6-4605-b9c1-3e0ad803a90c-kube-api-access-dpbzw\") pod \"auto-csr-approver-29561878-psf9k\" (UID: \"692e77e5-b3c6-4605-b9c1-3e0ad803a90c\") " 
pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.470254 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.685576 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d","Type":"ContainerStarted","Data":"afe0e91de4e645b00e7ef63aa7a2ca38af275e9da46c94a7ab795a3f429aef73"} Mar 17 01:58:00 crc kubenswrapper[4735]: I0317 01:58:00.902753 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-psf9k"] Mar 17 01:58:04 crc kubenswrapper[4735]: I0317 01:58:04.757886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-psf9k" event={"ID":"692e77e5-b3c6-4605-b9c1-3e0ad803a90c","Type":"ContainerStarted","Data":"778c6ff0e00517f754c398e6948aa7b25770f11976c841b8d3936b13ad1663fb"} Mar 17 01:58:10 crc kubenswrapper[4735]: I0317 01:58:10.828137 4735 generic.go:334] "Generic (PLEG): container finished" podID="692e77e5-b3c6-4605-b9c1-3e0ad803a90c" containerID="5dae8c5f457463687011df4b32e20ebe770b4aaee1ea0e654fbfdb1d68a03333" exitCode=0 Mar 17 01:58:10 crc kubenswrapper[4735]: I0317 01:58:10.828301 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-psf9k" event={"ID":"692e77e5-b3c6-4605-b9c1-3e0ad803a90c","Type":"ContainerDied","Data":"5dae8c5f457463687011df4b32e20ebe770b4aaee1ea0e654fbfdb1d68a03333"} Mar 17 01:58:12 crc kubenswrapper[4735]: I0317 01:58:12.606967 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 17 01:58:12 crc kubenswrapper[4735]: I0317 01:58:12.607962 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.145162 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4sqwr"] Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.148058 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.190711 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sqwr"] Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.247820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-utilities\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.248290 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-catalog-content\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.248375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzq67\" (UniqueName: 
\"kubernetes.io/projected/47b0912b-d1cf-4096-9d36-c5d93e40b294-kube-api-access-qzq67\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.351452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-catalog-content\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.351494 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzq67\" (UniqueName: \"kubernetes.io/projected/47b0912b-d1cf-4096-9d36-c5d93e40b294-kube-api-access-qzq67\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.351546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-utilities\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.352367 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-utilities\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.352583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-catalog-content\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.374243 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzq67\" (UniqueName: \"kubernetes.io/projected/47b0912b-d1cf-4096-9d36-c5d93e40b294-kube-api-access-qzq67\") pod \"redhat-marketplace-4sqwr\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:25 crc kubenswrapper[4735]: I0317 01:58:25.494993 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:32 crc kubenswrapper[4735]: E0317 01:58:32.400278 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.39:5001/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:58:32 crc kubenswrapper[4735]: E0317 01:58:32.400797 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.39:5001/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9" Mar 17 01:58:32 crc kubenswrapper[4735]: E0317 01:58:32.404550 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:38.102.83.39:5001/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kq9td,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liveness
Probe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-multi-thread-testing_openstack(07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 01:58:32 crc kubenswrapper[4735]: E0317 01:58:32.406674 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.423770 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.505446 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-psf9k" Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.505451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-psf9k" event={"ID":"692e77e5-b3c6-4605-b9c1-3e0ad803a90c","Type":"ContainerDied","Data":"778c6ff0e00517f754c398e6948aa7b25770f11976c841b8d3936b13ad1663fb"} Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.505497 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="778c6ff0e00517f754c398e6948aa7b25770f11976c841b8d3936b13ad1663fb" Mar 17 01:58:32 crc kubenswrapper[4735]: E0317 01:58:32.508391 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.39:5001/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.523532 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpbzw\" (UniqueName: \"kubernetes.io/projected/692e77e5-b3c6-4605-b9c1-3e0ad803a90c-kube-api-access-dpbzw\") pod \"692e77e5-b3c6-4605-b9c1-3e0ad803a90c\" (UID: \"692e77e5-b3c6-4605-b9c1-3e0ad803a90c\") " Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.542213 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692e77e5-b3c6-4605-b9c1-3e0ad803a90c-kube-api-access-dpbzw" (OuterVolumeSpecName: "kube-api-access-dpbzw") pod "692e77e5-b3c6-4605-b9c1-3e0ad803a90c" (UID: "692e77e5-b3c6-4605-b9c1-3e0ad803a90c"). InnerVolumeSpecName "kube-api-access-dpbzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.626780 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpbzw\" (UniqueName: \"kubernetes.io/projected/692e77e5-b3c6-4605-b9c1-3e0ad803a90c-kube-api-access-dpbzw\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:32 crc kubenswrapper[4735]: I0317 01:58:32.834028 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sqwr"] Mar 17 01:58:33 crc kubenswrapper[4735]: I0317 01:58:33.541640 4735 generic.go:334] "Generic (PLEG): container finished" podID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerID="5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa" exitCode=0 Mar 17 01:58:33 crc kubenswrapper[4735]: I0317 01:58:33.541962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerDied","Data":"5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa"} Mar 17 01:58:33 crc kubenswrapper[4735]: I0317 01:58:33.541992 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerStarted","Data":"2ee08f23f4a8cd55877d9fb587e6dd119a3acf1d1e7dde5105a31c51329064f1"} Mar 17 01:58:33 crc kubenswrapper[4735]: I0317 01:58:33.570196 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-7547z"] Mar 17 01:58:33 crc kubenswrapper[4735]: I0317 01:58:33.584280 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-7547z"] Mar 17 01:58:34 crc kubenswrapper[4735]: I0317 01:58:34.558545 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" 
event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerStarted","Data":"0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370"} Mar 17 01:58:35 crc kubenswrapper[4735]: I0317 01:58:35.103277 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f699f9a1-dc18-45ce-9c5f-2c384e8f936e" path="/var/lib/kubelet/pods/f699f9a1-dc18-45ce-9c5f-2c384e8f936e/volumes" Mar 17 01:58:35 crc kubenswrapper[4735]: I0317 01:58:35.569620 4735 generic.go:334] "Generic (PLEG): container finished" podID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerID="0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370" exitCode=0 Mar 17 01:58:35 crc kubenswrapper[4735]: I0317 01:58:35.569667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerDied","Data":"0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370"} Mar 17 01:58:36 crc kubenswrapper[4735]: I0317 01:58:36.584061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerStarted","Data":"bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1"} Mar 17 01:58:36 crc kubenswrapper[4735]: I0317 01:58:36.623378 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4sqwr" podStartSLOduration=9.110390977 podStartE2EDuration="11.623352425s" podCreationTimestamp="2026-03-17 01:58:25 +0000 UTC" firstStartedPulling="2026-03-17 01:58:33.546730947 +0000 UTC m=+2939.178963925" lastFinishedPulling="2026-03-17 01:58:36.059692365 +0000 UTC m=+2941.691925373" observedRunningTime="2026-03-17 01:58:36.616091338 +0000 UTC m=+2942.248324336" watchObservedRunningTime="2026-03-17 01:58:36.623352425 +0000 UTC m=+2942.255585413" Mar 17 01:58:42 crc kubenswrapper[4735]: I0317 01:58:42.606983 
4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:58:42 crc kubenswrapper[4735]: I0317 01:58:42.608927 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:58:45 crc kubenswrapper[4735]: I0317 01:58:45.495814 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:45 crc kubenswrapper[4735]: I0317 01:58:45.496208 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:45 crc kubenswrapper[4735]: I0317 01:58:45.550809 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:45 crc kubenswrapper[4735]: I0317 01:58:45.726326 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:45 crc kubenswrapper[4735]: I0317 01:58:45.800416 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sqwr"] Mar 17 01:58:47 crc kubenswrapper[4735]: I0317 01:58:47.696411 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4sqwr" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="registry-server" containerID="cri-o://bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1" gracePeriod=2 Mar 17 01:58:48 
crc kubenswrapper[4735]: I0317 01:58:48.146573 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.277302 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.419651 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-catalog-content\") pod \"47b0912b-d1cf-4096-9d36-c5d93e40b294\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.419879 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-utilities\") pod \"47b0912b-d1cf-4096-9d36-c5d93e40b294\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.419932 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzq67\" (UniqueName: \"kubernetes.io/projected/47b0912b-d1cf-4096-9d36-c5d93e40b294-kube-api-access-qzq67\") pod \"47b0912b-d1cf-4096-9d36-c5d93e40b294\" (UID: \"47b0912b-d1cf-4096-9d36-c5d93e40b294\") " Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.420801 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-utilities" (OuterVolumeSpecName: "utilities") pod "47b0912b-d1cf-4096-9d36-c5d93e40b294" (UID: "47b0912b-d1cf-4096-9d36-c5d93e40b294"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.424734 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b0912b-d1cf-4096-9d36-c5d93e40b294-kube-api-access-qzq67" (OuterVolumeSpecName: "kube-api-access-qzq67") pod "47b0912b-d1cf-4096-9d36-c5d93e40b294" (UID: "47b0912b-d1cf-4096-9d36-c5d93e40b294"). InnerVolumeSpecName "kube-api-access-qzq67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.447406 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47b0912b-d1cf-4096-9d36-c5d93e40b294" (UID: "47b0912b-d1cf-4096-9d36-c5d93e40b294"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.522684 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.522721 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzq67\" (UniqueName: \"kubernetes.io/projected/47b0912b-d1cf-4096-9d36-c5d93e40b294-kube-api-access-qzq67\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.522738 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b0912b-d1cf-4096-9d36-c5d93e40b294-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.710726 4735 generic.go:334] "Generic (PLEG): container finished" podID="47b0912b-d1cf-4096-9d36-c5d93e40b294" 
containerID="bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1" exitCode=0 Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.710832 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerDied","Data":"bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1"} Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.710896 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sqwr" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.711148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sqwr" event={"ID":"47b0912b-d1cf-4096-9d36-c5d93e40b294","Type":"ContainerDied","Data":"2ee08f23f4a8cd55877d9fb587e6dd119a3acf1d1e7dde5105a31c51329064f1"} Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.711193 4735 scope.go:117] "RemoveContainer" containerID="bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.775568 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sqwr"] Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.784741 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sqwr"] Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.789441 4735 scope.go:117] "RemoveContainer" containerID="0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.806660 4735 scope.go:117] "RemoveContainer" containerID="5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.862476 4735 scope.go:117] "RemoveContainer" containerID="bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1" Mar 17 
01:58:48 crc kubenswrapper[4735]: E0317 01:58:48.862875 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1\": container with ID starting with bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1 not found: ID does not exist" containerID="bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.862905 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1"} err="failed to get container status \"bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1\": rpc error: code = NotFound desc = could not find container \"bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1\": container with ID starting with bc56816f26508e69cd88fcd2945f03c432f390a4de6ab7fac08d8a961c5d68e1 not found: ID does not exist" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.862929 4735 scope.go:117] "RemoveContainer" containerID="0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370" Mar 17 01:58:48 crc kubenswrapper[4735]: E0317 01:58:48.863152 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370\": container with ID starting with 0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370 not found: ID does not exist" containerID="0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.863178 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370"} err="failed to get container status 
\"0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370\": rpc error: code = NotFound desc = could not find container \"0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370\": container with ID starting with 0298bf6edbb0e1b5ecfd44b14d072879011a8ce90e56503ebe5bd0ed4e082370 not found: ID does not exist" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.863192 4735 scope.go:117] "RemoveContainer" containerID="5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa" Mar 17 01:58:48 crc kubenswrapper[4735]: E0317 01:58:48.863414 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa\": container with ID starting with 5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa not found: ID does not exist" containerID="5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa" Mar 17 01:58:48 crc kubenswrapper[4735]: I0317 01:58:48.863457 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa"} err="failed to get container status \"5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa\": rpc error: code = NotFound desc = could not find container \"5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa\": container with ID starting with 5f33b1199c4e275e93a0089b8d13afe5ee0d03b8f741b876afb1713479fdbdaa not found: ID does not exist" Mar 17 01:58:49 crc kubenswrapper[4735]: I0317 01:58:49.086301 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" path="/var/lib/kubelet/pods/47b0912b-d1cf-4096-9d36-c5d93e40b294/volumes" Mar 17 01:58:49 crc kubenswrapper[4735]: I0317 01:58:49.723381 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" 
event={"ID":"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d","Type":"ContainerStarted","Data":"6154a9fb8d8bf213e9f3d93edb81056fcbcea600a0ca305807a507cd16047088"} Mar 17 01:58:49 crc kubenswrapper[4735]: I0317 01:58:49.751603 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podStartSLOduration=4.588443219 podStartE2EDuration="52.751577922s" podCreationTimestamp="2026-03-17 01:57:57 +0000 UTC" firstStartedPulling="2026-03-17 01:57:59.978882124 +0000 UTC m=+2905.611115092" lastFinishedPulling="2026-03-17 01:58:48.142016817 +0000 UTC m=+2953.774249795" observedRunningTime="2026-03-17 01:58:49.74366739 +0000 UTC m=+2955.375900368" watchObservedRunningTime="2026-03-17 01:58:49.751577922 +0000 UTC m=+2955.383810890" Mar 17 01:58:56 crc kubenswrapper[4735]: I0317 01:58:56.185965 4735 scope.go:117] "RemoveContainer" containerID="7f6c9a5d71acfe633170a0083ffb1806b9378326984d02224977d346a36aa55f" Mar 17 01:59:12 crc kubenswrapper[4735]: I0317 01:59:12.606030 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:59:12 crc kubenswrapper[4735]: I0317 01:59:12.606521 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:59:12 crc kubenswrapper[4735]: I0317 01:59:12.606568 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 01:59:12 crc kubenswrapper[4735]: I0317 01:59:12.607319 4735 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:59:12 crc kubenswrapper[4735]: I0317 01:59:12.607372 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" gracePeriod=600 Mar 17 01:59:12 crc kubenswrapper[4735]: E0317 01:59:12.739817 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:59:13 crc kubenswrapper[4735]: I0317 01:59:13.040488 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" exitCode=0 Mar 17 01:59:13 crc kubenswrapper[4735]: I0317 01:59:13.040827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb"} Mar 17 01:59:13 crc kubenswrapper[4735]: I0317 01:59:13.040913 4735 scope.go:117] "RemoveContainer" 
containerID="231fcd5e63fe450140935954db3606d53aac1964da346772df02580b5f31a6b5" Mar 17 01:59:13 crc kubenswrapper[4735]: I0317 01:59:13.042338 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 01:59:13 crc kubenswrapper[4735]: E0317 01:59:13.042948 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:59:27 crc kubenswrapper[4735]: I0317 01:59:27.072926 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 01:59:27 crc kubenswrapper[4735]: E0317 01:59:27.073779 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:59:40 crc kubenswrapper[4735]: I0317 01:59:40.073716 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 01:59:40 crc kubenswrapper[4735]: E0317 01:59:40.074562 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.836015 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jqll6"] Mar 17 01:59:49 crc kubenswrapper[4735]: E0317 01:59:49.837083 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692e77e5-b3c6-4605-b9c1-3e0ad803a90c" containerName="oc" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.837101 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="692e77e5-b3c6-4605-b9c1-3e0ad803a90c" containerName="oc" Mar 17 01:59:49 crc kubenswrapper[4735]: E0317 01:59:49.837132 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="registry-server" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.837142 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="registry-server" Mar 17 01:59:49 crc kubenswrapper[4735]: E0317 01:59:49.837156 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="extract-utilities" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.837164 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="extract-utilities" Mar 17 01:59:49 crc kubenswrapper[4735]: E0317 01:59:49.837211 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="extract-content" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.837218 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="extract-content" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.837439 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="692e77e5-b3c6-4605-b9c1-3e0ad803a90c" containerName="oc" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.837460 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b0912b-d1cf-4096-9d36-c5d93e40b294" containerName="registry-server" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.842217 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.885114 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jqll6"] Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.894924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-utilities\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.895243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7jg\" (UniqueName: \"kubernetes.io/projected/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-kube-api-access-hz7jg\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.895371 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-catalog-content\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.997556 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-utilities\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.997973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7jg\" (UniqueName: \"kubernetes.io/projected/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-kube-api-access-hz7jg\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.998003 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-catalog-content\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.998088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-utilities\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:49 crc kubenswrapper[4735]: I0317 01:59:49.998369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-catalog-content\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:50 crc kubenswrapper[4735]: I0317 01:59:50.019656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7jg\" (UniqueName: 
\"kubernetes.io/projected/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-kube-api-access-hz7jg\") pod \"redhat-operators-jqll6\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:50 crc kubenswrapper[4735]: I0317 01:59:50.196940 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 01:59:50 crc kubenswrapper[4735]: I0317 01:59:50.960597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jqll6"] Mar 17 01:59:50 crc kubenswrapper[4735]: W0317 01:59:50.971746 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21605e8_f4e2_43f4_99b1_d0d4958ac72c.slice/crio-a6de1d902252c69ed87bb7eef65f97d096d2f445bce4a91632de4e703f43ac05 WatchSource:0}: Error finding container a6de1d902252c69ed87bb7eef65f97d096d2f445bce4a91632de4e703f43ac05: Status 404 returned error can't find the container with id a6de1d902252c69ed87bb7eef65f97d096d2f445bce4a91632de4e703f43ac05 Mar 17 01:59:51 crc kubenswrapper[4735]: I0317 01:59:51.417553 4735 generic.go:334] "Generic (PLEG): container finished" podID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerID="f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd" exitCode=0 Mar 17 01:59:51 crc kubenswrapper[4735]: I0317 01:59:51.417813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqll6" event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerDied","Data":"f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd"} Mar 17 01:59:51 crc kubenswrapper[4735]: I0317 01:59:51.417840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqll6" 
event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerStarted","Data":"a6de1d902252c69ed87bb7eef65f97d096d2f445bce4a91632de4e703f43ac05"} Mar 17 01:59:51 crc kubenswrapper[4735]: I0317 01:59:51.421331 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:59:53 crc kubenswrapper[4735]: I0317 01:59:53.437485 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqll6" event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerStarted","Data":"70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c"} Mar 17 01:59:54 crc kubenswrapper[4735]: I0317 01:59:54.073723 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 01:59:54 crc kubenswrapper[4735]: E0317 01:59:54.074322 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 01:59:58 crc kubenswrapper[4735]: I0317 01:59:58.479934 4735 generic.go:334] "Generic (PLEG): container finished" podID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerID="70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c" exitCode=0 Mar 17 01:59:58 crc kubenswrapper[4735]: I0317 01:59:58.479996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqll6" event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerDied","Data":"70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c"} Mar 17 01:59:59 crc kubenswrapper[4735]: I0317 01:59:59.491485 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jqll6" event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerStarted","Data":"8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9"} Mar 17 01:59:59 crc kubenswrapper[4735]: I0317 01:59:59.508216 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jqll6" podStartSLOduration=2.986633578 podStartE2EDuration="10.508193267s" podCreationTimestamp="2026-03-17 01:59:49 +0000 UTC" firstStartedPulling="2026-03-17 01:59:51.421094545 +0000 UTC m=+3017.053327523" lastFinishedPulling="2026-03-17 01:59:58.942654184 +0000 UTC m=+3024.574887212" observedRunningTime="2026-03-17 01:59:59.507559352 +0000 UTC m=+3025.139792330" watchObservedRunningTime="2026-03-17 01:59:59.508193267 +0000 UTC m=+3025.140426255" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.178289 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561880-ml84n"] Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.179921 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.189786 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.190014 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.190174 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.194483 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl"] Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.195956 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.197016 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.198091 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.203214 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.206406 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.206529 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl"] Mar 17 02:00:00 
crc kubenswrapper[4735]: I0317 02:00:00.214409 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-ml84n"] Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.311269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdd59\" (UniqueName: \"kubernetes.io/projected/4ab217a3-57e5-4677-9aa2-b35bff4d3584-kube-api-access-mdd59\") pod \"auto-csr-approver-29561880-ml84n\" (UID: \"4ab217a3-57e5-4677-9aa2-b35bff4d3584\") " pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.311316 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-472nz\" (UniqueName: \"kubernetes.io/projected/da359745-5f32-4749-8c55-e19847bae917-kube-api-access-472nz\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.311354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da359745-5f32-4749-8c55-e19847bae917-config-volume\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.311566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da359745-5f32-4749-8c55-e19847bae917-secret-volume\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.413986 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdd59\" (UniqueName: \"kubernetes.io/projected/4ab217a3-57e5-4677-9aa2-b35bff4d3584-kube-api-access-mdd59\") pod \"auto-csr-approver-29561880-ml84n\" (UID: \"4ab217a3-57e5-4677-9aa2-b35bff4d3584\") " pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.414055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-472nz\" (UniqueName: \"kubernetes.io/projected/da359745-5f32-4749-8c55-e19847bae917-kube-api-access-472nz\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.414127 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da359745-5f32-4749-8c55-e19847bae917-config-volume\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.414219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da359745-5f32-4749-8c55-e19847bae917-secret-volume\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.415127 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da359745-5f32-4749-8c55-e19847bae917-config-volume\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.427737 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da359745-5f32-4749-8c55-e19847bae917-secret-volume\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.433662 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdd59\" (UniqueName: \"kubernetes.io/projected/4ab217a3-57e5-4677-9aa2-b35bff4d3584-kube-api-access-mdd59\") pod \"auto-csr-approver-29561880-ml84n\" (UID: \"4ab217a3-57e5-4677-9aa2-b35bff4d3584\") " pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.448066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-472nz\" (UniqueName: \"kubernetes.io/projected/da359745-5f32-4749-8c55-e19847bae917-kube-api-access-472nz\") pod \"collect-profiles-29561880-mgvtl\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.504128 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.517850 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:00 crc kubenswrapper[4735]: I0317 02:00:00.986190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl"] Mar 17 02:00:01 crc kubenswrapper[4735]: I0317 02:00:01.256383 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:01 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:01 crc kubenswrapper[4735]: > Mar 17 02:00:01 crc kubenswrapper[4735]: I0317 02:00:01.365762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-ml84n"] Mar 17 02:00:01 crc kubenswrapper[4735]: I0317 02:00:01.511342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-ml84n" event={"ID":"4ab217a3-57e5-4677-9aa2-b35bff4d3584","Type":"ContainerStarted","Data":"85b9dddb1f9f1b0c17ad1f06d7df474cb30f9b3fefcd488451cfa199db3f346e"} Mar 17 02:00:01 crc kubenswrapper[4735]: I0317 02:00:01.515069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" event={"ID":"da359745-5f32-4749-8c55-e19847bae917","Type":"ContainerStarted","Data":"e8f88eb69d915cb8494a9dd8483ff75a20193bddf531ad3e1e82a63c3a06cd9d"} Mar 17 02:00:01 crc kubenswrapper[4735]: I0317 02:00:01.515090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" event={"ID":"da359745-5f32-4749-8c55-e19847bae917","Type":"ContainerStarted","Data":"52b8b17642f932c4d30af3122c264711124f13482aadff1f6b771304982c63b3"} Mar 17 02:00:01 crc kubenswrapper[4735]: I0317 02:00:01.539763 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" podStartSLOduration=1.539743678 podStartE2EDuration="1.539743678s" podCreationTimestamp="2026-03-17 02:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:00:01.528934971 +0000 UTC m=+3027.161167949" watchObservedRunningTime="2026-03-17 02:00:01.539743678 +0000 UTC m=+3027.171976666" Mar 17 02:00:02 crc kubenswrapper[4735]: I0317 02:00:02.522519 4735 generic.go:334] "Generic (PLEG): container finished" podID="da359745-5f32-4749-8c55-e19847bae917" containerID="e8f88eb69d915cb8494a9dd8483ff75a20193bddf531ad3e1e82a63c3a06cd9d" exitCode=0 Mar 17 02:00:02 crc kubenswrapper[4735]: I0317 02:00:02.522615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" event={"ID":"da359745-5f32-4749-8c55-e19847bae917","Type":"ContainerDied","Data":"e8f88eb69d915cb8494a9dd8483ff75a20193bddf531ad3e1e82a63c3a06cd9d"} Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.301297 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z7kxj"] Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.303962 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.312230 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7kxj"] Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.502043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-catalog-content\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.502126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-utilities\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.502166 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmst\" (UniqueName: \"kubernetes.io/projected/cc12460c-4d11-4d4e-8ad4-1425e72a4866-kube-api-access-ncmst\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.604052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-catalog-content\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.605072 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-utilities\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.605525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmst\" (UniqueName: \"kubernetes.io/projected/cc12460c-4d11-4d4e-8ad4-1425e72a4866-kube-api-access-ncmst\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.605444 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-utilities\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.604791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-catalog-content\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.625628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmst\" (UniqueName: \"kubernetes.io/projected/cc12460c-4d11-4d4e-8ad4-1425e72a4866-kube-api-access-ncmst\") pod \"certified-operators-z7kxj\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.920133 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:03 crc kubenswrapper[4735]: I0317 02:00:03.945690 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.113363 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da359745-5f32-4749-8c55-e19847bae917-secret-volume\") pod \"da359745-5f32-4749-8c55-e19847bae917\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.113725 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-472nz\" (UniqueName: \"kubernetes.io/projected/da359745-5f32-4749-8c55-e19847bae917-kube-api-access-472nz\") pod \"da359745-5f32-4749-8c55-e19847bae917\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.113848 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da359745-5f32-4749-8c55-e19847bae917-config-volume\") pod \"da359745-5f32-4749-8c55-e19847bae917\" (UID: \"da359745-5f32-4749-8c55-e19847bae917\") " Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.114328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da359745-5f32-4749-8c55-e19847bae917-config-volume" (OuterVolumeSpecName: "config-volume") pod "da359745-5f32-4749-8c55-e19847bae917" (UID: "da359745-5f32-4749-8c55-e19847bae917"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.120154 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da359745-5f32-4749-8c55-e19847bae917-kube-api-access-472nz" (OuterVolumeSpecName: "kube-api-access-472nz") pod "da359745-5f32-4749-8c55-e19847bae917" (UID: "da359745-5f32-4749-8c55-e19847bae917"). InnerVolumeSpecName "kube-api-access-472nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.124645 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da359745-5f32-4749-8c55-e19847bae917-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da359745-5f32-4749-8c55-e19847bae917" (UID: "da359745-5f32-4749-8c55-e19847bae917"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.215617 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da359745-5f32-4749-8c55-e19847bae917-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.215645 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-472nz\" (UniqueName: \"kubernetes.io/projected/da359745-5f32-4749-8c55-e19847bae917-kube-api-access-472nz\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.215655 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da359745-5f32-4749-8c55-e19847bae917-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.410021 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7kxj"] Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.557085 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7kxj" event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerStarted","Data":"cdfea41f92c62f5011e64890743e652fff19ba8d578f188d1f787e3177ed4b0e"} Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.568121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" event={"ID":"da359745-5f32-4749-8c55-e19847bae917","Type":"ContainerDied","Data":"52b8b17642f932c4d30af3122c264711124f13482aadff1f6b771304982c63b3"} Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.568159 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b8b17642f932c4d30af3122c264711124f13482aadff1f6b771304982c63b3" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.568254 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl" Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.622077 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl"] Mar 17 02:00:04 crc kubenswrapper[4735]: I0317 02:00:04.629444 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-kwwfl"] Mar 17 02:00:05 crc kubenswrapper[4735]: I0317 02:00:05.084384 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:00:05 crc kubenswrapper[4735]: E0317 02:00:05.084665 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:00:05 crc kubenswrapper[4735]: I0317 02:00:05.085635 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee5ca9d-9692-4474-81a7-226fd9192272" path="/var/lib/kubelet/pods/aee5ca9d-9692-4474-81a7-226fd9192272/volumes" Mar 17 02:00:05 crc kubenswrapper[4735]: I0317 02:00:05.583572 4735 generic.go:334] "Generic (PLEG): container finished" podID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerID="14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627" exitCode=0 Mar 17 02:00:05 crc kubenswrapper[4735]: I0317 02:00:05.583685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7kxj" event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerDied","Data":"14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627"} Mar 17 02:00:07 crc kubenswrapper[4735]: I0317 02:00:07.608228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7kxj" event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerStarted","Data":"3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d"} Mar 17 02:00:09 crc kubenswrapper[4735]: I0317 02:00:09.651709 4735 generic.go:334] "Generic (PLEG): container finished" podID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerID="3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d" exitCode=0 Mar 17 02:00:09 crc kubenswrapper[4735]: I0317 02:00:09.651789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7kxj" event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerDied","Data":"3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d"} Mar 17 02:00:10 crc kubenswrapper[4735]: I0317 02:00:10.662269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-z7kxj" event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerStarted","Data":"29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0"} Mar 17 02:00:10 crc kubenswrapper[4735]: I0317 02:00:10.680782 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z7kxj" podStartSLOduration=3.16120618 podStartE2EDuration="7.680764799s" podCreationTimestamp="2026-03-17 02:00:03 +0000 UTC" firstStartedPulling="2026-03-17 02:00:05.585818632 +0000 UTC m=+3031.218051650" lastFinishedPulling="2026-03-17 02:00:10.105377291 +0000 UTC m=+3035.737610269" observedRunningTime="2026-03-17 02:00:10.678304052 +0000 UTC m=+3036.310537030" watchObservedRunningTime="2026-03-17 02:00:10.680764799 +0000 UTC m=+3036.312997797" Mar 17 02:00:11 crc kubenswrapper[4735]: I0317 02:00:11.262467 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:11 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:11 crc kubenswrapper[4735]: > Mar 17 02:00:13 crc kubenswrapper[4735]: I0317 02:00:13.921198 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:13 crc kubenswrapper[4735]: I0317 02:00:13.921685 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:13 crc kubenswrapper[4735]: I0317 02:00:13.973816 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:16 crc kubenswrapper[4735]: I0317 02:00:16.073012 4735 scope.go:117] "RemoveContainer" 
containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:00:16 crc kubenswrapper[4735]: E0317 02:00:16.073634 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:00:21 crc kubenswrapper[4735]: I0317 02:00:21.246110 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:21 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:21 crc kubenswrapper[4735]: > Mar 17 02:00:21 crc kubenswrapper[4735]: I0317 02:00:21.768157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-ml84n" event={"ID":"4ab217a3-57e5-4677-9aa2-b35bff4d3584","Type":"ContainerStarted","Data":"67c4ffdd722fb2eeb03ac84d09f8b3293b3c2bb8c5b6ceeb7cb209408513de56"} Mar 17 02:00:21 crc kubenswrapper[4735]: I0317 02:00:21.787631 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561880-ml84n" podStartSLOduration=2.078739144 podStartE2EDuration="21.787607916s" podCreationTimestamp="2026-03-17 02:00:00 +0000 UTC" firstStartedPulling="2026-03-17 02:00:01.359838452 +0000 UTC m=+3026.992071430" lastFinishedPulling="2026-03-17 02:00:21.068707214 +0000 UTC m=+3046.700940202" observedRunningTime="2026-03-17 02:00:21.786630594 +0000 UTC m=+3047.418863572" watchObservedRunningTime="2026-03-17 02:00:21.787607916 +0000 UTC m=+3047.419840924" Mar 17 02:00:23 crc kubenswrapper[4735]: I0317 
02:00:23.783221 4735 generic.go:334] "Generic (PLEG): container finished" podID="4ab217a3-57e5-4677-9aa2-b35bff4d3584" containerID="67c4ffdd722fb2eeb03ac84d09f8b3293b3c2bb8c5b6ceeb7cb209408513de56" exitCode=0 Mar 17 02:00:23 crc kubenswrapper[4735]: I0317 02:00:23.783303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-ml84n" event={"ID":"4ab217a3-57e5-4677-9aa2-b35bff4d3584","Type":"ContainerDied","Data":"67c4ffdd722fb2eeb03ac84d09f8b3293b3c2bb8c5b6ceeb7cb209408513de56"} Mar 17 02:00:23 crc kubenswrapper[4735]: I0317 02:00:23.981952 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:24 crc kubenswrapper[4735]: I0317 02:00:24.049085 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7kxj"] Mar 17 02:00:24 crc kubenswrapper[4735]: I0317 02:00:24.792528 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z7kxj" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="registry-server" containerID="cri-o://29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0" gracePeriod=2 Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.497800 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.505408 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.666913 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncmst\" (UniqueName: \"kubernetes.io/projected/cc12460c-4d11-4d4e-8ad4-1425e72a4866-kube-api-access-ncmst\") pod \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.667021 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-catalog-content\") pod \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.667049 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-utilities\") pod \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\" (UID: \"cc12460c-4d11-4d4e-8ad4-1425e72a4866\") " Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.667203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdd59\" (UniqueName: \"kubernetes.io/projected/4ab217a3-57e5-4677-9aa2-b35bff4d3584-kube-api-access-mdd59\") pod \"4ab217a3-57e5-4677-9aa2-b35bff4d3584\" (UID: \"4ab217a3-57e5-4677-9aa2-b35bff4d3584\") " Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.668698 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-utilities" (OuterVolumeSpecName: "utilities") pod "cc12460c-4d11-4d4e-8ad4-1425e72a4866" (UID: "cc12460c-4d11-4d4e-8ad4-1425e72a4866"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.695478 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc12460c-4d11-4d4e-8ad4-1425e72a4866-kube-api-access-ncmst" (OuterVolumeSpecName: "kube-api-access-ncmst") pod "cc12460c-4d11-4d4e-8ad4-1425e72a4866" (UID: "cc12460c-4d11-4d4e-8ad4-1425e72a4866"). InnerVolumeSpecName "kube-api-access-ncmst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.697090 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab217a3-57e5-4677-9aa2-b35bff4d3584-kube-api-access-mdd59" (OuterVolumeSpecName: "kube-api-access-mdd59") pod "4ab217a3-57e5-4677-9aa2-b35bff4d3584" (UID: "4ab217a3-57e5-4677-9aa2-b35bff4d3584"). InnerVolumeSpecName "kube-api-access-mdd59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.711814 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc12460c-4d11-4d4e-8ad4-1425e72a4866" (UID: "cc12460c-4d11-4d4e-8ad4-1425e72a4866"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.769183 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdd59\" (UniqueName: \"kubernetes.io/projected/4ab217a3-57e5-4677-9aa2-b35bff4d3584-kube-api-access-mdd59\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.769739 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncmst\" (UniqueName: \"kubernetes.io/projected/cc12460c-4d11-4d4e-8ad4-1425e72a4866-kube-api-access-ncmst\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.769762 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.769774 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc12460c-4d11-4d4e-8ad4-1425e72a4866-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.804201 4735 generic.go:334] "Generic (PLEG): container finished" podID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerID="29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0" exitCode=0 Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.804247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7kxj" event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerDied","Data":"29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0"} Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.804272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7kxj" 
event={"ID":"cc12460c-4d11-4d4e-8ad4-1425e72a4866","Type":"ContainerDied","Data":"cdfea41f92c62f5011e64890743e652fff19ba8d578f188d1f787e3177ed4b0e"} Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.804288 4735 scope.go:117] "RemoveContainer" containerID="29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.804384 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7kxj" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.819449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-ml84n" event={"ID":"4ab217a3-57e5-4677-9aa2-b35bff4d3584","Type":"ContainerDied","Data":"85b9dddb1f9f1b0c17ad1f06d7df474cb30f9b3fefcd488451cfa199db3f346e"} Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.819498 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b9dddb1f9f1b0c17ad1f06d7df474cb30f9b3fefcd488451cfa199db3f346e" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.819573 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-ml84n" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.863606 4735 scope.go:117] "RemoveContainer" containerID="3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.882560 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7kxj"] Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.892800 4735 scope.go:117] "RemoveContainer" containerID="14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.903918 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z7kxj"] Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.910357 4735 scope.go:117] "RemoveContainer" containerID="29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0" Mar 17 02:00:25 crc kubenswrapper[4735]: E0317 02:00:25.910847 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0\": container with ID starting with 29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0 not found: ID does not exist" containerID="29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.910951 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0"} err="failed to get container status \"29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0\": rpc error: code = NotFound desc = could not find container \"29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0\": container with ID starting with 29156c53664bd659a68870634845332a6d20fc3a21c99b03be295d05f869ebe0 not 
found: ID does not exist" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.910975 4735 scope.go:117] "RemoveContainer" containerID="3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d" Mar 17 02:00:25 crc kubenswrapper[4735]: E0317 02:00:25.911288 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d\": container with ID starting with 3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d not found: ID does not exist" containerID="3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.911315 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d"} err="failed to get container status \"3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d\": rpc error: code = NotFound desc = could not find container \"3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d\": container with ID starting with 3c9e421a49330c825ab6771d62f61fbf0c87203d6ae01384441b5fec8a6a284d not found: ID does not exist" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.911329 4735 scope.go:117] "RemoveContainer" containerID="14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627" Mar 17 02:00:25 crc kubenswrapper[4735]: E0317 02:00:25.911732 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627\": container with ID starting with 14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627 not found: ID does not exist" containerID="14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.911765 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627"} err="failed to get container status \"14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627\": rpc error: code = NotFound desc = could not find container \"14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627\": container with ID starting with 14f9f7845a9f66a186691c89456ab3b76042188488ced13c85b6fab88527a627 not found: ID does not exist" Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.925576 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-58lx6"] Mar 17 02:00:25 crc kubenswrapper[4735]: I0317 02:00:25.945670 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-58lx6"] Mar 17 02:00:27 crc kubenswrapper[4735]: I0317 02:00:27.081885 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4100711d-cc79-4277-bc28-f8c6cc56bfb0" path="/var/lib/kubelet/pods/4100711d-cc79-4277-bc28-f8c6cc56bfb0/volumes" Mar 17 02:00:27 crc kubenswrapper[4735]: I0317 02:00:27.083440 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" path="/var/lib/kubelet/pods/cc12460c-4d11-4d4e-8ad4-1425e72a4866/volumes" Mar 17 02:00:28 crc kubenswrapper[4735]: I0317 02:00:28.072761 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:00:28 crc kubenswrapper[4735]: E0317 02:00:28.073297 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:00:31 crc kubenswrapper[4735]: I0317 02:00:31.244880 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:31 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:31 crc kubenswrapper[4735]: > Mar 17 02:00:39 crc kubenswrapper[4735]: I0317 02:00:39.073875 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:00:39 crc kubenswrapper[4735]: E0317 02:00:39.074590 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:00:41 crc kubenswrapper[4735]: I0317 02:00:41.250513 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:41 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:41 crc kubenswrapper[4735]: > Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.350405 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qhwnx"] Mar 17 02:00:50 crc kubenswrapper[4735]: E0317 02:00:50.362833 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab217a3-57e5-4677-9aa2-b35bff4d3584" containerName="oc" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.363349 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4ab217a3-57e5-4677-9aa2-b35bff4d3584" containerName="oc" Mar 17 02:00:50 crc kubenswrapper[4735]: E0317 02:00:50.363371 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="extract-utilities" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.363402 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="extract-utilities" Mar 17 02:00:50 crc kubenswrapper[4735]: E0317 02:00:50.363458 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="extract-content" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.363466 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="extract-content" Mar 17 02:00:50 crc kubenswrapper[4735]: E0317 02:00:50.363485 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da359745-5f32-4749-8c55-e19847bae917" containerName="collect-profiles" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.363492 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="da359745-5f32-4749-8c55-e19847bae917" containerName="collect-profiles" Mar 17 02:00:50 crc kubenswrapper[4735]: E0317 02:00:50.363499 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="registry-server" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.363505 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="registry-server" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.364947 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="da359745-5f32-4749-8c55-e19847bae917" containerName="collect-profiles" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.364973 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4ab217a3-57e5-4677-9aa2-b35bff4d3584" containerName="oc" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.364988 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc12460c-4d11-4d4e-8ad4-1425e72a4866" containerName="registry-server" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.371561 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.411347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-catalog-content\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.411408 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zgs\" (UniqueName: \"kubernetes.io/projected/f2f52161-44d4-4fa7-b031-13fa6d180acc-kube-api-access-z6zgs\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.411439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-utilities\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.513411 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-catalog-content\") pod 
\"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.513759 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zgs\" (UniqueName: \"kubernetes.io/projected/f2f52161-44d4-4fa7-b031-13fa6d180acc-kube-api-access-z6zgs\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.513793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-utilities\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.518507 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-catalog-content\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.524533 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-utilities\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.529593 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhwnx"] Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.568583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-z6zgs\" (UniqueName: \"kubernetes.io/projected/f2f52161-44d4-4fa7-b031-13fa6d180acc-kube-api-access-z6zgs\") pod \"community-operators-qhwnx\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:50 crc kubenswrapper[4735]: I0317 02:00:50.701673 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:00:51 crc kubenswrapper[4735]: I0317 02:00:51.260477 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:51 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:51 crc kubenswrapper[4735]: > Mar 17 02:00:52 crc kubenswrapper[4735]: I0317 02:00:52.049453 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhwnx"] Mar 17 02:00:52 crc kubenswrapper[4735]: I0317 02:00:52.072998 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:00:52 crc kubenswrapper[4735]: E0317 02:00:52.073292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:00:52 crc kubenswrapper[4735]: W0317 02:00:52.073373 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f52161_44d4_4fa7_b031_13fa6d180acc.slice/crio-4bb2f8269b5bf1932a5034d86a11155f4d7d63bd503ee633d9f5c25fb0a986c5 WatchSource:0}: Error finding container 4bb2f8269b5bf1932a5034d86a11155f4d7d63bd503ee633d9f5c25fb0a986c5: Status 404 returned error can't find the container with id 4bb2f8269b5bf1932a5034d86a11155f4d7d63bd503ee633d9f5c25fb0a986c5 Mar 17 02:00:53 crc kubenswrapper[4735]: I0317 02:00:53.056158 4735 generic.go:334] "Generic (PLEG): container finished" podID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerID="879030dbf739cfd4cfe3bd03a0dda5eaac2eec6e0ac33c49667f7398b1812625" exitCode=0 Mar 17 02:00:53 crc kubenswrapper[4735]: I0317 02:00:53.056277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerDied","Data":"879030dbf739cfd4cfe3bd03a0dda5eaac2eec6e0ac33c49667f7398b1812625"} Mar 17 02:00:53 crc kubenswrapper[4735]: I0317 02:00:53.056740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerStarted","Data":"4bb2f8269b5bf1932a5034d86a11155f4d7d63bd503ee633d9f5c25fb0a986c5"} Mar 17 02:00:55 crc kubenswrapper[4735]: I0317 02:00:55.088431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerStarted","Data":"cfb704560f722dd6f65253e0103c805bf4144e24f481455b8c85ab9ff66cc2e4"} Mar 17 02:00:56 crc kubenswrapper[4735]: I0317 02:00:56.314154 4735 scope.go:117] "RemoveContainer" containerID="dfbbdfad9b6c33022c340310b793334bf0ebcb3c23323c15df1bafe91acbffc1" Mar 17 02:00:56 crc kubenswrapper[4735]: I0317 02:00:56.438659 4735 scope.go:117] "RemoveContainer" containerID="a6cdb0e4ab56afba5560d96b4748f241772752f8bd94631191773f1e31aa3838" Mar 
17 02:00:57 crc kubenswrapper[4735]: I0317 02:00:57.092384 4735 generic.go:334] "Generic (PLEG): container finished" podID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerID="cfb704560f722dd6f65253e0103c805bf4144e24f481455b8c85ab9ff66cc2e4" exitCode=0 Mar 17 02:00:57 crc kubenswrapper[4735]: I0317 02:00:57.092461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerDied","Data":"cfb704560f722dd6f65253e0103c805bf4144e24f481455b8c85ab9ff66cc2e4"} Mar 17 02:00:58 crc kubenswrapper[4735]: I0317 02:00:58.105360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerStarted","Data":"be9d3bc553806ec5cdba206ad74cf00f860ac46a8056a65ddec67036191fdd19"} Mar 17 02:00:58 crc kubenswrapper[4735]: I0317 02:00:58.132718 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qhwnx" podStartSLOduration=3.617018205 podStartE2EDuration="8.129276957s" podCreationTimestamp="2026-03-17 02:00:50 +0000 UTC" firstStartedPulling="2026-03-17 02:00:53.059622199 +0000 UTC m=+3078.691855187" lastFinishedPulling="2026-03-17 02:00:57.571880921 +0000 UTC m=+3083.204113939" observedRunningTime="2026-03-17 02:00:58.12195119 +0000 UTC m=+3083.754184178" watchObservedRunningTime="2026-03-17 02:00:58.129276957 +0000 UTC m=+3083.761509935" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.253957 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.355581 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29561881-lgzt4"] Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.360219 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.426435 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-combined-ca-bundle\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.426604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-fernet-keys\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.426647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptqm\" (UniqueName: \"kubernetes.io/projected/9576ffce-8ade-47e7-8f60-c928cb0116b5-kube-api-access-fptqm\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.426700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-config-data\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.463150 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561881-lgzt4"] Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.466094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.528604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-fernet-keys\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.528904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptqm\" (UniqueName: \"kubernetes.io/projected/9576ffce-8ade-47e7-8f60-c928cb0116b5-kube-api-access-fptqm\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.529044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-config-data\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.529237 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-combined-ca-bundle\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.549447 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptqm\" (UniqueName: \"kubernetes.io/projected/9576ffce-8ade-47e7-8f60-c928cb0116b5-kube-api-access-fptqm\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " 
pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.550589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-combined-ca-bundle\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.551682 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-config-data\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.556471 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-fernet-keys\") pod \"keystone-cron-29561881-lgzt4\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.596750 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jqll6"] Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.679125 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.702258 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:01:00 crc kubenswrapper[4735]: I0317 02:01:00.702402 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:01:01 crc kubenswrapper[4735]: I0317 02:01:01.307520 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561881-lgzt4"] Mar 17 02:01:01 crc kubenswrapper[4735]: I0317 02:01:01.802496 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qhwnx" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="registry-server" probeResult="failure" output=< Mar 17 02:01:01 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:01:01 crc kubenswrapper[4735]: > Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.141852 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-lgzt4" event={"ID":"9576ffce-8ade-47e7-8f60-c928cb0116b5","Type":"ContainerStarted","Data":"189db3d276c7d7a3b997a9a9f3f56ae9cbf33d9aa35841e5200c0a711d9761d5"} Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.142346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-lgzt4" event={"ID":"9576ffce-8ade-47e7-8f60-c928cb0116b5","Type":"ContainerStarted","Data":"2323245c32399cf967d63086589ab85a7a191bbe77e30e0da2bc562ad7f94344"} Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.145127 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jqll6" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" 
containerID="cri-o://8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9" gracePeriod=2 Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.764178 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.790935 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29561881-lgzt4" podStartSLOduration=2.790917108 podStartE2EDuration="2.790917108s" podCreationTimestamp="2026-03-17 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:01:02.175502194 +0000 UTC m=+3087.807735172" watchObservedRunningTime="2026-03-17 02:01:02.790917108 +0000 UTC m=+3088.423150086" Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.871391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-utilities\") pod \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.871518 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7jg\" (UniqueName: \"kubernetes.io/projected/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-kube-api-access-hz7jg\") pod \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.871738 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-catalog-content\") pod \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\" (UID: \"a21605e8-f4e2-43f4-99b1-d0d4958ac72c\") " Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.875131 
4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-utilities" (OuterVolumeSpecName: "utilities") pod "a21605e8-f4e2-43f4-99b1-d0d4958ac72c" (UID: "a21605e8-f4e2-43f4-99b1-d0d4958ac72c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.881084 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-kube-api-access-hz7jg" (OuterVolumeSpecName: "kube-api-access-hz7jg") pod "a21605e8-f4e2-43f4-99b1-d0d4958ac72c" (UID: "a21605e8-f4e2-43f4-99b1-d0d4958ac72c"). InnerVolumeSpecName "kube-api-access-hz7jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.973725 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:02 crc kubenswrapper[4735]: I0317 02:01:02.973755 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz7jg\" (UniqueName: \"kubernetes.io/projected/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-kube-api-access-hz7jg\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.025677 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a21605e8-f4e2-43f4-99b1-d0d4958ac72c" (UID: "a21605e8-f4e2-43f4-99b1-d0d4958ac72c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.075549 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21605e8-f4e2-43f4-99b1-d0d4958ac72c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.153134 4735 generic.go:334] "Generic (PLEG): container finished" podID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerID="8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9" exitCode=0 Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.153399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqll6" event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerDied","Data":"8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9"} Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.153443 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jqll6" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.153459 4735 scope.go:117] "RemoveContainer" containerID="8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.153447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqll6" event={"ID":"a21605e8-f4e2-43f4-99b1-d0d4958ac72c","Type":"ContainerDied","Data":"a6de1d902252c69ed87bb7eef65f97d096d2f445bce4a91632de4e703f43ac05"} Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.174771 4735 scope.go:117] "RemoveContainer" containerID="70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.181538 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jqll6"] Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.193572 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jqll6"] Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.216040 4735 scope.go:117] "RemoveContainer" containerID="f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.298447 4735 scope.go:117] "RemoveContainer" containerID="8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9" Mar 17 02:01:03 crc kubenswrapper[4735]: E0317 02:01:03.306963 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9\": container with ID starting with 8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9 not found: ID does not exist" containerID="8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.307060 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9"} err="failed to get container status \"8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9\": rpc error: code = NotFound desc = could not find container \"8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9\": container with ID starting with 8f85f80e7133f205681f0e4690372d1914ac5f59044fd3df39807bb45e2d5af9 not found: ID does not exist" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.307099 4735 scope.go:117] "RemoveContainer" containerID="70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c" Mar 17 02:01:03 crc kubenswrapper[4735]: E0317 02:01:03.308427 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c\": container with ID starting with 70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c not found: ID does not exist" containerID="70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.308462 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c"} err="failed to get container status \"70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c\": rpc error: code = NotFound desc = could not find container \"70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c\": container with ID starting with 70199e592f553cff2b5f9020d664dead18d678f44efdca83e824fceb1853522c not found: ID does not exist" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.308484 4735 scope.go:117] "RemoveContainer" containerID="f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd" Mar 17 02:01:03 crc kubenswrapper[4735]: E0317 
02:01:03.309527 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd\": container with ID starting with f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd not found: ID does not exist" containerID="f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd" Mar 17 02:01:03 crc kubenswrapper[4735]: I0317 02:01:03.309569 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd"} err="failed to get container status \"f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd\": rpc error: code = NotFound desc = could not find container \"f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd\": container with ID starting with f0462b166cd153b8df47ba3682ff2943dd572a3c40296ac3653fbd4698f46ddd not found: ID does not exist" Mar 17 02:01:04 crc kubenswrapper[4735]: I0317 02:01:04.074014 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:01:04 crc kubenswrapper[4735]: E0317 02:01:04.074423 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:01:05 crc kubenswrapper[4735]: I0317 02:01:05.086546 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" path="/var/lib/kubelet/pods/a21605e8-f4e2-43f4-99b1-d0d4958ac72c/volumes" Mar 17 02:01:06 crc kubenswrapper[4735]: I0317 02:01:06.177963 
4735 generic.go:334] "Generic (PLEG): container finished" podID="9576ffce-8ade-47e7-8f60-c928cb0116b5" containerID="189db3d276c7d7a3b997a9a9f3f56ae9cbf33d9aa35841e5200c0a711d9761d5" exitCode=0 Mar 17 02:01:06 crc kubenswrapper[4735]: I0317 02:01:06.178022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-lgzt4" event={"ID":"9576ffce-8ade-47e7-8f60-c928cb0116b5","Type":"ContainerDied","Data":"189db3d276c7d7a3b997a9a9f3f56ae9cbf33d9aa35841e5200c0a711d9761d5"} Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.894357 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.973598 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-fernet-keys\") pod \"9576ffce-8ade-47e7-8f60-c928cb0116b5\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.973711 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fptqm\" (UniqueName: \"kubernetes.io/projected/9576ffce-8ade-47e7-8f60-c928cb0116b5-kube-api-access-fptqm\") pod \"9576ffce-8ade-47e7-8f60-c928cb0116b5\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.973757 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-combined-ca-bundle\") pod \"9576ffce-8ade-47e7-8f60-c928cb0116b5\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.973771 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-config-data\") pod \"9576ffce-8ade-47e7-8f60-c928cb0116b5\" (UID: \"9576ffce-8ade-47e7-8f60-c928cb0116b5\") " Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.997697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9576ffce-8ade-47e7-8f60-c928cb0116b5-kube-api-access-fptqm" (OuterVolumeSpecName: "kube-api-access-fptqm") pod "9576ffce-8ade-47e7-8f60-c928cb0116b5" (UID: "9576ffce-8ade-47e7-8f60-c928cb0116b5"). InnerVolumeSpecName "kube-api-access-fptqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:01:07 crc kubenswrapper[4735]: I0317 02:01:07.998319 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9576ffce-8ade-47e7-8f60-c928cb0116b5" (UID: "9576ffce-8ade-47e7-8f60-c928cb0116b5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.025100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9576ffce-8ade-47e7-8f60-c928cb0116b5" (UID: "9576ffce-8ade-47e7-8f60-c928cb0116b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.046996 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-config-data" (OuterVolumeSpecName: "config-data") pod "9576ffce-8ade-47e7-8f60-c928cb0116b5" (UID: "9576ffce-8ade-47e7-8f60-c928cb0116b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.075266 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.075303 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fptqm\" (UniqueName: \"kubernetes.io/projected/9576ffce-8ade-47e7-8f60-c928cb0116b5-kube-api-access-fptqm\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.075318 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.075329 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9576ffce-8ade-47e7-8f60-c928cb0116b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.197133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-lgzt4" event={"ID":"9576ffce-8ade-47e7-8f60-c928cb0116b5","Type":"ContainerDied","Data":"2323245c32399cf967d63086589ab85a7a191bbe77e30e0da2bc562ad7f94344"} Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.197173 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2323245c32399cf967d63086589ab85a7a191bbe77e30e0da2bc562ad7f94344" Mar 17 02:01:08 crc kubenswrapper[4735]: I0317 02:01:08.197237 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-lgzt4" Mar 17 02:01:11 crc kubenswrapper[4735]: I0317 02:01:11.755636 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qhwnx" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="registry-server" probeResult="failure" output=< Mar 17 02:01:11 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:01:11 crc kubenswrapper[4735]: > Mar 17 02:01:19 crc kubenswrapper[4735]: I0317 02:01:19.073176 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:01:19 crc kubenswrapper[4735]: E0317 02:01:19.073923 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:01:20 crc kubenswrapper[4735]: I0317 02:01:20.795470 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:01:20 crc kubenswrapper[4735]: I0317 02:01:20.846500 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:01:21 crc kubenswrapper[4735]: I0317 02:01:21.429326 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhwnx"] Mar 17 02:01:22 crc kubenswrapper[4735]: I0317 02:01:22.466069 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qhwnx" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="registry-server" 
containerID="cri-o://be9d3bc553806ec5cdba206ad74cf00f860ac46a8056a65ddec67036191fdd19" gracePeriod=2 Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.472082 4735 generic.go:334] "Generic (PLEG): container finished" podID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerID="be9d3bc553806ec5cdba206ad74cf00f860ac46a8056a65ddec67036191fdd19" exitCode=0 Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.472177 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerDied","Data":"be9d3bc553806ec5cdba206ad74cf00f860ac46a8056a65ddec67036191fdd19"} Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.618986 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.809881 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-utilities\") pod \"f2f52161-44d4-4fa7-b031-13fa6d180acc\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.810206 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-catalog-content\") pod \"f2f52161-44d4-4fa7-b031-13fa6d180acc\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.810237 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6zgs\" (UniqueName: \"kubernetes.io/projected/f2f52161-44d4-4fa7-b031-13fa6d180acc-kube-api-access-z6zgs\") pod \"f2f52161-44d4-4fa7-b031-13fa6d180acc\" (UID: \"f2f52161-44d4-4fa7-b031-13fa6d180acc\") " Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 
02:01:23.813659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-utilities" (OuterVolumeSpecName: "utilities") pod "f2f52161-44d4-4fa7-b031-13fa6d180acc" (UID: "f2f52161-44d4-4fa7-b031-13fa6d180acc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.833422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f52161-44d4-4fa7-b031-13fa6d180acc-kube-api-access-z6zgs" (OuterVolumeSpecName: "kube-api-access-z6zgs") pod "f2f52161-44d4-4fa7-b031-13fa6d180acc" (UID: "f2f52161-44d4-4fa7-b031-13fa6d180acc"). InnerVolumeSpecName "kube-api-access-z6zgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.912730 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6zgs\" (UniqueName: \"kubernetes.io/projected/f2f52161-44d4-4fa7-b031-13fa6d180acc-kube-api-access-z6zgs\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:23 crc kubenswrapper[4735]: I0317 02:01:23.912761 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.032556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2f52161-44d4-4fa7-b031-13fa6d180acc" (UID: "f2f52161-44d4-4fa7-b031-13fa6d180acc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.120981 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f52161-44d4-4fa7-b031-13fa6d180acc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.485777 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhwnx" event={"ID":"f2f52161-44d4-4fa7-b031-13fa6d180acc","Type":"ContainerDied","Data":"4bb2f8269b5bf1932a5034d86a11155f4d7d63bd503ee633d9f5c25fb0a986c5"} Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.485829 4735 scope.go:117] "RemoveContainer" containerID="be9d3bc553806ec5cdba206ad74cf00f860ac46a8056a65ddec67036191fdd19" Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.485935 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhwnx" Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.523068 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhwnx"] Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.524491 4735 scope.go:117] "RemoveContainer" containerID="cfb704560f722dd6f65253e0103c805bf4144e24f481455b8c85ab9ff66cc2e4" Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.537838 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qhwnx"] Mar 17 02:01:24 crc kubenswrapper[4735]: I0317 02:01:24.556995 4735 scope.go:117] "RemoveContainer" containerID="879030dbf739cfd4cfe3bd03a0dda5eaac2eec6e0ac33c49667f7398b1812625" Mar 17 02:01:25 crc kubenswrapper[4735]: I0317 02:01:25.083388 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" path="/var/lib/kubelet/pods/f2f52161-44d4-4fa7-b031-13fa6d180acc/volumes" Mar 17 02:01:30 crc 
kubenswrapper[4735]: I0317 02:01:30.072990 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:01:30 crc kubenswrapper[4735]: E0317 02:01:30.073689 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:01:43 crc kubenswrapper[4735]: I0317 02:01:43.073608 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:01:43 crc kubenswrapper[4735]: E0317 02:01:43.074400 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:01:58 crc kubenswrapper[4735]: I0317 02:01:58.073880 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:01:58 crc kubenswrapper[4735]: E0317 02:01:58.074791 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 
17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.592142 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561882-6zkg9"] Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598368 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9576ffce-8ade-47e7-8f60-c928cb0116b5" containerName="keystone-cron" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.598402 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9576ffce-8ade-47e7-8f60-c928cb0116b5" containerName="keystone-cron" Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598439 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="extract-content" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.598446 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="extract-content" Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598457 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="extract-utilities" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.598464 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="extract-utilities" Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598483 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="extract-content" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.598488 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="extract-content" Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598501 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" Mar 17 02:02:00 crc 
kubenswrapper[4735]: I0317 02:02:00.598508 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598519 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="registry-server" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.598533 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="registry-server" Mar 17 02:02:00 crc kubenswrapper[4735]: E0317 02:02:00.598543 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="extract-utilities" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.598549 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="extract-utilities" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.602479 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9576ffce-8ade-47e7-8f60-c928cb0116b5" containerName="keystone-cron" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.602521 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21605e8-f4e2-43f4-99b1-d0d4958ac72c" containerName="registry-server" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.602548 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f52161-44d4-4fa7-b031-13fa6d180acc" containerName="registry-server" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.605646 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.615717 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.615717 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.615725 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.673215 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-6zkg9"] Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.692657 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhlqc\" (UniqueName: \"kubernetes.io/projected/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15-kube-api-access-jhlqc\") pod \"auto-csr-approver-29561882-6zkg9\" (UID: \"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15\") " pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.794576 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhlqc\" (UniqueName: \"kubernetes.io/projected/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15-kube-api-access-jhlqc\") pod \"auto-csr-approver-29561882-6zkg9\" (UID: \"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15\") " pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.838893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhlqc\" (UniqueName: \"kubernetes.io/projected/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15-kube-api-access-jhlqc\") pod \"auto-csr-approver-29561882-6zkg9\" (UID: \"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15\") " 
pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:00 crc kubenswrapper[4735]: I0317 02:02:00.939780 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:02 crc kubenswrapper[4735]: I0317 02:02:02.008335 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-6zkg9"] Mar 17 02:02:02 crc kubenswrapper[4735]: I0317 02:02:02.843438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" event={"ID":"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15","Type":"ContainerStarted","Data":"962d50d03f57ea1d20c4347a1c138746668ba075684a9f59c41cb00dd23f53ff"} Mar 17 02:02:04 crc kubenswrapper[4735]: I0317 02:02:04.865753 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" event={"ID":"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15","Type":"ContainerStarted","Data":"7764a271913fd8c2fe136730d1067f0bb4ebced129ba62fbfac1151fa1e1002d"} Mar 17 02:02:04 crc kubenswrapper[4735]: I0317 02:02:04.891548 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" podStartSLOduration=3.74418009 podStartE2EDuration="4.889189513s" podCreationTimestamp="2026-03-17 02:02:00 +0000 UTC" firstStartedPulling="2026-03-17 02:02:02.041938695 +0000 UTC m=+3147.674171673" lastFinishedPulling="2026-03-17 02:02:03.186948078 +0000 UTC m=+3148.819181096" observedRunningTime="2026-03-17 02:02:04.881524898 +0000 UTC m=+3150.513757916" watchObservedRunningTime="2026-03-17 02:02:04.889189513 +0000 UTC m=+3150.521422511" Mar 17 02:02:05 crc kubenswrapper[4735]: I0317 02:02:05.876813 4735 generic.go:334] "Generic (PLEG): container finished" podID="65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15" containerID="7764a271913fd8c2fe136730d1067f0bb4ebced129ba62fbfac1151fa1e1002d" exitCode=0 Mar 17 02:02:05 crc 
kubenswrapper[4735]: I0317 02:02:05.877003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" event={"ID":"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15","Type":"ContainerDied","Data":"7764a271913fd8c2fe136730d1067f0bb4ebced129ba62fbfac1151fa1e1002d"} Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.449062 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.464586 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhlqc\" (UniqueName: \"kubernetes.io/projected/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15-kube-api-access-jhlqc\") pod \"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15\" (UID: \"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15\") " Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.472203 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15-kube-api-access-jhlqc" (OuterVolumeSpecName: "kube-api-access-jhlqc") pod "65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15" (UID: "65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15"). InnerVolumeSpecName "kube-api-access-jhlqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.566494 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhlqc\" (UniqueName: \"kubernetes.io/projected/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15-kube-api-access-jhlqc\") on node \"crc\" DevicePath \"\"" Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.900223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" event={"ID":"65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15","Type":"ContainerDied","Data":"962d50d03f57ea1d20c4347a1c138746668ba075684a9f59c41cb00dd23f53ff"} Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.900308 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-6zkg9" Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.900912 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962d50d03f57ea1d20c4347a1c138746668ba075684a9f59c41cb00dd23f53ff" Mar 17 02:02:07 crc kubenswrapper[4735]: I0317 02:02:07.992416 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-z6872"] Mar 17 02:02:08 crc kubenswrapper[4735]: I0317 02:02:08.002726 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-z6872"] Mar 17 02:02:09 crc kubenswrapper[4735]: I0317 02:02:09.089658 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1313a60f-d73a-4de6-8750-8f2563aba379" path="/var/lib/kubelet/pods/1313a60f-d73a-4de6-8750-8f2563aba379/volumes" Mar 17 02:02:11 crc kubenswrapper[4735]: I0317 02:02:11.074089 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:02:11 crc kubenswrapper[4735]: E0317 02:02:11.076428 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:02:25 crc kubenswrapper[4735]: I0317 02:02:25.083221 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:02:25 crc kubenswrapper[4735]: E0317 02:02:25.085062 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:02:40 crc kubenswrapper[4735]: I0317 02:02:40.073770 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:02:40 crc kubenswrapper[4735]: E0317 02:02:40.074666 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:02:55 crc kubenswrapper[4735]: I0317 02:02:55.081512 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:02:55 crc kubenswrapper[4735]: E0317 02:02:55.082411 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:02:56 crc kubenswrapper[4735]: I0317 02:02:56.775668 4735 scope.go:117] "RemoveContainer" containerID="a594d9cc36e0b84ca23100eccd08f746276944a140a785d75701be9fe908fb42" Mar 17 02:03:08 crc kubenswrapper[4735]: I0317 02:03:08.074160 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:03:08 crc kubenswrapper[4735]: E0317 02:03:08.074961 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:03:23 crc kubenswrapper[4735]: I0317 02:03:23.074684 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:03:23 crc kubenswrapper[4735]: E0317 02:03:23.076672 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:03:35 crc kubenswrapper[4735]: I0317 02:03:35.086583 4735 scope.go:117] 
"RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:03:35 crc kubenswrapper[4735]: E0317 02:03:35.090054 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:03:46 crc kubenswrapper[4735]: I0317 02:03:46.074296 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:03:46 crc kubenswrapper[4735]: E0317 02:03:46.075196 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:03:58 crc kubenswrapper[4735]: I0317 02:03:58.073989 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:03:58 crc kubenswrapper[4735]: E0317 02:03:58.075172 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.351246 
4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561884-hlvkq"] Mar 17 02:04:00 crc kubenswrapper[4735]: E0317 02:04:00.356616 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15" containerName="oc" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.356669 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15" containerName="oc" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.357159 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15" containerName="oc" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.359473 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.369402 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.370378 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.370402 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.430077 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-hlvkq"] Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.437522 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx6nc\" (UniqueName: \"kubernetes.io/projected/8438b716-2caa-4b91-90d3-ecfd2d9804fb-kube-api-access-sx6nc\") pod \"auto-csr-approver-29561884-hlvkq\" (UID: \"8438b716-2caa-4b91-90d3-ecfd2d9804fb\") " 
pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.538834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx6nc\" (UniqueName: \"kubernetes.io/projected/8438b716-2caa-4b91-90d3-ecfd2d9804fb-kube-api-access-sx6nc\") pod \"auto-csr-approver-29561884-hlvkq\" (UID: \"8438b716-2caa-4b91-90d3-ecfd2d9804fb\") " pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.587086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx6nc\" (UniqueName: \"kubernetes.io/projected/8438b716-2caa-4b91-90d3-ecfd2d9804fb-kube-api-access-sx6nc\") pod \"auto-csr-approver-29561884-hlvkq\" (UID: \"8438b716-2caa-4b91-90d3-ecfd2d9804fb\") " pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:00 crc kubenswrapper[4735]: I0317 02:04:00.684834 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:02 crc kubenswrapper[4735]: I0317 02:04:02.070608 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-hlvkq"] Mar 17 02:04:02 crc kubenswrapper[4735]: I0317 02:04:02.955019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" event={"ID":"8438b716-2caa-4b91-90d3-ecfd2d9804fb","Type":"ContainerStarted","Data":"9717aef6a147d89870058fa2ad6872123620cdf2ce2a3c68ea2a29db6c6bd38f"} Mar 17 02:04:03 crc kubenswrapper[4735]: I0317 02:04:03.964781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" event={"ID":"8438b716-2caa-4b91-90d3-ecfd2d9804fb","Type":"ContainerStarted","Data":"dc58a5c67fcb34b4b07539cefa66c8e6f8e1a857498fbda5851144890d8e292f"} Mar 17 02:04:03 crc kubenswrapper[4735]: I0317 02:04:03.989071 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" podStartSLOduration=3.063669411 podStartE2EDuration="3.987886761s" podCreationTimestamp="2026-03-17 02:04:00 +0000 UTC" firstStartedPulling="2026-03-17 02:04:02.095997626 +0000 UTC m=+3267.728230604" lastFinishedPulling="2026-03-17 02:04:03.020214976 +0000 UTC m=+3268.652447954" observedRunningTime="2026-03-17 02:04:03.980312578 +0000 UTC m=+3269.612545576" watchObservedRunningTime="2026-03-17 02:04:03.987886761 +0000 UTC m=+3269.620119759" Mar 17 02:04:05 crc kubenswrapper[4735]: I0317 02:04:05.980823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" event={"ID":"8438b716-2caa-4b91-90d3-ecfd2d9804fb","Type":"ContainerDied","Data":"dc58a5c67fcb34b4b07539cefa66c8e6f8e1a857498fbda5851144890d8e292f"} Mar 17 02:04:05 crc kubenswrapper[4735]: I0317 02:04:05.981689 4735 generic.go:334] "Generic (PLEG): container finished" podID="8438b716-2caa-4b91-90d3-ecfd2d9804fb" containerID="dc58a5c67fcb34b4b07539cefa66c8e6f8e1a857498fbda5851144890d8e292f" exitCode=0 Mar 17 02:04:07 crc kubenswrapper[4735]: I0317 02:04:07.520123 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:07 crc kubenswrapper[4735]: I0317 02:04:07.628424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx6nc\" (UniqueName: \"kubernetes.io/projected/8438b716-2caa-4b91-90d3-ecfd2d9804fb-kube-api-access-sx6nc\") pod \"8438b716-2caa-4b91-90d3-ecfd2d9804fb\" (UID: \"8438b716-2caa-4b91-90d3-ecfd2d9804fb\") " Mar 17 02:04:07 crc kubenswrapper[4735]: I0317 02:04:07.644104 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8438b716-2caa-4b91-90d3-ecfd2d9804fb-kube-api-access-sx6nc" (OuterVolumeSpecName: "kube-api-access-sx6nc") pod "8438b716-2caa-4b91-90d3-ecfd2d9804fb" (UID: "8438b716-2caa-4b91-90d3-ecfd2d9804fb"). InnerVolumeSpecName "kube-api-access-sx6nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:04:07 crc kubenswrapper[4735]: I0317 02:04:07.730605 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx6nc\" (UniqueName: \"kubernetes.io/projected/8438b716-2caa-4b91-90d3-ecfd2d9804fb-kube-api-access-sx6nc\") on node \"crc\" DevicePath \"\"" Mar 17 02:04:07 crc kubenswrapper[4735]: I0317 02:04:07.998038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" event={"ID":"8438b716-2caa-4b91-90d3-ecfd2d9804fb","Type":"ContainerDied","Data":"9717aef6a147d89870058fa2ad6872123620cdf2ce2a3c68ea2a29db6c6bd38f"} Mar 17 02:04:07 crc kubenswrapper[4735]: I0317 02:04:07.998497 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-hlvkq" Mar 17 02:04:08 crc kubenswrapper[4735]: I0317 02:04:08.002880 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9717aef6a147d89870058fa2ad6872123620cdf2ce2a3c68ea2a29db6c6bd38f" Mar 17 02:04:08 crc kubenswrapper[4735]: I0317 02:04:08.095740 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-psf9k"] Mar 17 02:04:08 crc kubenswrapper[4735]: I0317 02:04:08.105406 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-psf9k"] Mar 17 02:04:09 crc kubenswrapper[4735]: I0317 02:04:09.083329 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692e77e5-b3c6-4605-b9c1-3e0ad803a90c" path="/var/lib/kubelet/pods/692e77e5-b3c6-4605-b9c1-3e0ad803a90c/volumes" Mar 17 02:04:11 crc kubenswrapper[4735]: I0317 02:04:11.074301 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:04:11 crc kubenswrapper[4735]: E0317 02:04:11.075260 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:04:26 crc kubenswrapper[4735]: I0317 02:04:26.085279 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:04:26 crc kubenswrapper[4735]: I0317 02:04:26.743347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"a3d06a6fc67b38d88c34757ec2082c5aad8ceedf120420810b5218b69d71fdd9"} Mar 17 02:04:57 crc kubenswrapper[4735]: I0317 02:04:57.035199 4735 scope.go:117] "RemoveContainer" containerID="5dae8c5f457463687011df4b32e20ebe770b4aaee1ea0e654fbfdb1d68a03333" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.279715 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561886-jhvfb"] Mar 17 02:06:00 crc kubenswrapper[4735]: E0317 02:06:00.284192 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8438b716-2caa-4b91-90d3-ecfd2d9804fb" containerName="oc" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.284220 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8438b716-2caa-4b91-90d3-ecfd2d9804fb" containerName="oc" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.286029 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8438b716-2caa-4b91-90d3-ecfd2d9804fb" containerName="oc" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.293612 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.302768 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.302778 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.302778 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.381300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmdr\" (UniqueName: \"kubernetes.io/projected/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00-kube-api-access-wpmdr\") pod \"auto-csr-approver-29561886-jhvfb\" (UID: \"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00\") " pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.382732 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-jhvfb"] Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.482919 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmdr\" (UniqueName: \"kubernetes.io/projected/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00-kube-api-access-wpmdr\") pod \"auto-csr-approver-29561886-jhvfb\" (UID: \"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00\") " pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.525377 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmdr\" (UniqueName: \"kubernetes.io/projected/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00-kube-api-access-wpmdr\") pod \"auto-csr-approver-29561886-jhvfb\" (UID: \"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00\") " 
pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:00 crc kubenswrapper[4735]: I0317 02:06:00.633636 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:01 crc kubenswrapper[4735]: I0317 02:06:01.992210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-jhvfb"] Mar 17 02:06:02 crc kubenswrapper[4735]: W0317 02:06:02.033574 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd621d6d4_2334_43f4_8f1b_b3f74ca9dd00.slice/crio-93969a361b088b7f8041b185b812e8f3f238272e811634654c530bd088d49d2f WatchSource:0}: Error finding container 93969a361b088b7f8041b185b812e8f3f238272e811634654c530bd088d49d2f: Status 404 returned error can't find the container with id 93969a361b088b7f8041b185b812e8f3f238272e811634654c530bd088d49d2f Mar 17 02:06:02 crc kubenswrapper[4735]: I0317 02:06:02.055837 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:06:02 crc kubenswrapper[4735]: I0317 02:06:02.983148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" event={"ID":"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00","Type":"ContainerStarted","Data":"93969a361b088b7f8041b185b812e8f3f238272e811634654c530bd088d49d2f"} Mar 17 02:06:05 crc kubenswrapper[4735]: I0317 02:06:05.005725 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" event={"ID":"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00","Type":"ContainerStarted","Data":"b245f5bed36bf6117ed46cedcef2a8749c6012f09b174636f3460266f83b23cc"} Mar 17 02:06:05 crc kubenswrapper[4735]: I0317 02:06:05.036718 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" 
podStartSLOduration=3.908355427 podStartE2EDuration="5.031815306s" podCreationTimestamp="2026-03-17 02:06:00 +0000 UTC" firstStartedPulling="2026-03-17 02:06:02.048564355 +0000 UTC m=+3387.680797343" lastFinishedPulling="2026-03-17 02:06:03.172024244 +0000 UTC m=+3388.804257222" observedRunningTime="2026-03-17 02:06:05.027853545 +0000 UTC m=+3390.660086533" watchObservedRunningTime="2026-03-17 02:06:05.031815306 +0000 UTC m=+3390.664048294" Mar 17 02:06:06 crc kubenswrapper[4735]: I0317 02:06:06.022686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" event={"ID":"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00","Type":"ContainerDied","Data":"b245f5bed36bf6117ed46cedcef2a8749c6012f09b174636f3460266f83b23cc"} Mar 17 02:06:06 crc kubenswrapper[4735]: I0317 02:06:06.022301 4735 generic.go:334] "Generic (PLEG): container finished" podID="d621d6d4-2334-43f4-8f1b-b3f74ca9dd00" containerID="b245f5bed36bf6117ed46cedcef2a8749c6012f09b174636f3460266f83b23cc" exitCode=0 Mar 17 02:06:07 crc kubenswrapper[4735]: I0317 02:06:07.570266 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:07 crc kubenswrapper[4735]: I0317 02:06:07.628342 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpmdr\" (UniqueName: \"kubernetes.io/projected/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00-kube-api-access-wpmdr\") pod \"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00\" (UID: \"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00\") " Mar 17 02:06:07 crc kubenswrapper[4735]: I0317 02:06:07.647364 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00-kube-api-access-wpmdr" (OuterVolumeSpecName: "kube-api-access-wpmdr") pod "d621d6d4-2334-43f4-8f1b-b3f74ca9dd00" (UID: "d621d6d4-2334-43f4-8f1b-b3f74ca9dd00"). InnerVolumeSpecName "kube-api-access-wpmdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:06:07 crc kubenswrapper[4735]: I0317 02:06:07.731957 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpmdr\" (UniqueName: \"kubernetes.io/projected/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00-kube-api-access-wpmdr\") on node \"crc\" DevicePath \"\"" Mar 17 02:06:08 crc kubenswrapper[4735]: I0317 02:06:08.047082 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" Mar 17 02:06:08 crc kubenswrapper[4735]: I0317 02:06:08.047075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-jhvfb" event={"ID":"d621d6d4-2334-43f4-8f1b-b3f74ca9dd00","Type":"ContainerDied","Data":"93969a361b088b7f8041b185b812e8f3f238272e811634654c530bd088d49d2f"} Mar 17 02:06:08 crc kubenswrapper[4735]: I0317 02:06:08.048280 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93969a361b088b7f8041b185b812e8f3f238272e811634654c530bd088d49d2f" Mar 17 02:06:08 crc kubenswrapper[4735]: I0317 02:06:08.146576 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-ml84n"] Mar 17 02:06:08 crc kubenswrapper[4735]: I0317 02:06:08.155910 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-ml84n"] Mar 17 02:06:09 crc kubenswrapper[4735]: I0317 02:06:09.084544 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab217a3-57e5-4677-9aa2-b35bff4d3584" path="/var/lib/kubelet/pods/4ab217a3-57e5-4677-9aa2-b35bff4d3584/volumes" Mar 17 02:06:42 crc kubenswrapper[4735]: I0317 02:06:42.606278 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 02:06:42 crc kubenswrapper[4735]: I0317 02:06:42.606749 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:06:57 crc kubenswrapper[4735]: I0317 02:06:57.246662 4735 scope.go:117] "RemoveContainer" containerID="67c4ffdd722fb2eeb03ac84d09f8b3293b3c2bb8c5b6ceeb7cb209408513de56" Mar 17 02:07:12 crc kubenswrapper[4735]: I0317 02:07:12.606679 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:07:12 crc kubenswrapper[4735]: I0317 02:07:12.607495 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:07:14 crc kubenswrapper[4735]: I0317 02:07:14.354112 4735 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mk2xg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 02:07:14 crc kubenswrapper[4735]: I0317 02:07:14.354161 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk2xg" podUID="687efab0-3f01-453e-b202-46d47422c46d" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 02:07:42 crc kubenswrapper[4735]: I0317 02:07:42.606127 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:07:42 crc kubenswrapper[4735]: I0317 02:07:42.606702 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:07:42 crc kubenswrapper[4735]: I0317 02:07:42.606752 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:07:42 crc kubenswrapper[4735]: I0317 02:07:42.608917 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3d06a6fc67b38d88c34757ec2082c5aad8ceedf120420810b5218b69d71fdd9"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:07:42 crc kubenswrapper[4735]: I0317 02:07:42.609787 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://a3d06a6fc67b38d88c34757ec2082c5aad8ceedf120420810b5218b69d71fdd9" gracePeriod=600 Mar 17 02:07:43 crc 
kubenswrapper[4735]: I0317 02:07:43.670982 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"a3d06a6fc67b38d88c34757ec2082c5aad8ceedf120420810b5218b69d71fdd9"} Mar 17 02:07:43 crc kubenswrapper[4735]: I0317 02:07:43.671471 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="a3d06a6fc67b38d88c34757ec2082c5aad8ceedf120420810b5218b69d71fdd9" exitCode=0 Mar 17 02:07:43 crc kubenswrapper[4735]: I0317 02:07:43.673517 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10"} Mar 17 02:07:43 crc kubenswrapper[4735]: I0317 02:07:43.673566 4735 scope.go:117] "RemoveContainer" containerID="986d9192e763bf22afe86471b28b8637e8d05d909cfaea3707d20e7914ac80cb" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.357983 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561888-scvlj"] Mar 17 02:08:00 crc kubenswrapper[4735]: E0317 02:08:00.366056 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d621d6d4-2334-43f4-8f1b-b3f74ca9dd00" containerName="oc" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.366245 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d621d6d4-2334-43f4-8f1b-b3f74ca9dd00" containerName="oc" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.370031 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d621d6d4-2334-43f4-8f1b-b3f74ca9dd00" containerName="oc" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.381003 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.407151 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.407176 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.423031 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.478674 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-scvlj"] Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.522484 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxccl\" (UniqueName: \"kubernetes.io/projected/311a35f1-29f3-4bf4-a39a-4f642c284101-kube-api-access-gxccl\") pod \"auto-csr-approver-29561888-scvlj\" (UID: \"311a35f1-29f3-4bf4-a39a-4f642c284101\") " pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.624728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxccl\" (UniqueName: \"kubernetes.io/projected/311a35f1-29f3-4bf4-a39a-4f642c284101-kube-api-access-gxccl\") pod \"auto-csr-approver-29561888-scvlj\" (UID: \"311a35f1-29f3-4bf4-a39a-4f642c284101\") " pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.661360 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxccl\" (UniqueName: \"kubernetes.io/projected/311a35f1-29f3-4bf4-a39a-4f642c284101-kube-api-access-gxccl\") pod \"auto-csr-approver-29561888-scvlj\" (UID: \"311a35f1-29f3-4bf4-a39a-4f642c284101\") " 
pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:00 crc kubenswrapper[4735]: I0317 02:08:00.726804 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:02 crc kubenswrapper[4735]: I0317 02:08:02.082393 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-scvlj"] Mar 17 02:08:02 crc kubenswrapper[4735]: W0317 02:08:02.121262 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311a35f1_29f3_4bf4_a39a_4f642c284101.slice/crio-a89c5a447be4fe24419c65df3846c149bde7f587bbcf161477170089af553a36 WatchSource:0}: Error finding container a89c5a447be4fe24419c65df3846c149bde7f587bbcf161477170089af553a36: Status 404 returned error can't find the container with id a89c5a447be4fe24419c65df3846c149bde7f587bbcf161477170089af553a36 Mar 17 02:08:02 crc kubenswrapper[4735]: I0317 02:08:02.842749 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-scvlj" event={"ID":"311a35f1-29f3-4bf4-a39a-4f642c284101","Type":"ContainerStarted","Data":"a89c5a447be4fe24419c65df3846c149bde7f587bbcf161477170089af553a36"} Mar 17 02:08:04 crc kubenswrapper[4735]: I0317 02:08:04.860385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-scvlj" event={"ID":"311a35f1-29f3-4bf4-a39a-4f642c284101","Type":"ContainerStarted","Data":"d359be0e1c53ed68c07664b05b23c3a184d4b1ee71f64868c751311ab2845205"} Mar 17 02:08:04 crc kubenswrapper[4735]: I0317 02:08:04.923092 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561888-scvlj" podStartSLOduration=3.920968034 podStartE2EDuration="4.921711195s" podCreationTimestamp="2026-03-17 02:08:00 +0000 UTC" firstStartedPulling="2026-03-17 02:08:02.128387626 +0000 UTC 
m=+3507.760620604" lastFinishedPulling="2026-03-17 02:08:03.129130787 +0000 UTC m=+3508.761363765" observedRunningTime="2026-03-17 02:08:04.907764204 +0000 UTC m=+3510.539997192" watchObservedRunningTime="2026-03-17 02:08:04.921711195 +0000 UTC m=+3510.553944173" Mar 17 02:08:05 crc kubenswrapper[4735]: I0317 02:08:05.869968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-scvlj" event={"ID":"311a35f1-29f3-4bf4-a39a-4f642c284101","Type":"ContainerDied","Data":"d359be0e1c53ed68c07664b05b23c3a184d4b1ee71f64868c751311ab2845205"} Mar 17 02:08:05 crc kubenswrapper[4735]: I0317 02:08:05.871212 4735 generic.go:334] "Generic (PLEG): container finished" podID="311a35f1-29f3-4bf4-a39a-4f642c284101" containerID="d359be0e1c53ed68c07664b05b23c3a184d4b1ee71f64868c751311ab2845205" exitCode=0 Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.482299 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.659136 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxccl\" (UniqueName: \"kubernetes.io/projected/311a35f1-29f3-4bf4-a39a-4f642c284101-kube-api-access-gxccl\") pod \"311a35f1-29f3-4bf4-a39a-4f642c284101\" (UID: \"311a35f1-29f3-4bf4-a39a-4f642c284101\") " Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.674265 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311a35f1-29f3-4bf4-a39a-4f642c284101-kube-api-access-gxccl" (OuterVolumeSpecName: "kube-api-access-gxccl") pod "311a35f1-29f3-4bf4-a39a-4f642c284101" (UID: "311a35f1-29f3-4bf4-a39a-4f642c284101"). InnerVolumeSpecName "kube-api-access-gxccl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.760866 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxccl\" (UniqueName: \"kubernetes.io/projected/311a35f1-29f3-4bf4-a39a-4f642c284101-kube-api-access-gxccl\") on node \"crc\" DevicePath \"\"" Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.903020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-scvlj" event={"ID":"311a35f1-29f3-4bf4-a39a-4f642c284101","Type":"ContainerDied","Data":"a89c5a447be4fe24419c65df3846c149bde7f587bbcf161477170089af553a36"} Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.903260 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-scvlj" Mar 17 02:08:07 crc kubenswrapper[4735]: I0317 02:08:07.904283 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a89c5a447be4fe24419c65df3846c149bde7f587bbcf161477170089af553a36" Mar 17 02:08:08 crc kubenswrapper[4735]: I0317 02:08:08.656702 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-6zkg9"] Mar 17 02:08:08 crc kubenswrapper[4735]: I0317 02:08:08.670549 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-6zkg9"] Mar 17 02:08:09 crc kubenswrapper[4735]: I0317 02:08:09.084711 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15" path="/var/lib/kubelet/pods/65bc0ef0-d4cc-4be2-9e8e-3ae660ecba15/volumes" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.590695 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-plrz8"] Mar 17 02:08:55 crc kubenswrapper[4735]: E0317 02:08:55.596472 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="311a35f1-29f3-4bf4-a39a-4f642c284101" containerName="oc" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.596498 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="311a35f1-29f3-4bf4-a39a-4f642c284101" containerName="oc" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.598056 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="311a35f1-29f3-4bf4-a39a-4f642c284101" containerName="oc" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.605950 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.716406 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnbbc\" (UniqueName: \"kubernetes.io/projected/9aa444e8-d344-4da7-87ae-2af798c38598-kube-api-access-cnbbc\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.716517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-catalog-content\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.716599 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-utilities\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.771183 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-plrz8"] Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.818198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbbc\" (UniqueName: \"kubernetes.io/projected/9aa444e8-d344-4da7-87ae-2af798c38598-kube-api-access-cnbbc\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.818349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-catalog-content\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.818445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-utilities\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.829286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-catalog-content\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.829363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-utilities\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " 
pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.866575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnbbc\" (UniqueName: \"kubernetes.io/projected/9aa444e8-d344-4da7-87ae-2af798c38598-kube-api-access-cnbbc\") pod \"redhat-marketplace-plrz8\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:55 crc kubenswrapper[4735]: I0317 02:08:55.941020 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:08:57 crc kubenswrapper[4735]: I0317 02:08:57.257881 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plrz8"] Mar 17 02:08:57 crc kubenswrapper[4735]: I0317 02:08:57.346616 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerStarted","Data":"ac96a6d5d2539803ac9ae3d5e70d5f0cb7fed958fde56b1ec61572f62b6a9767"} Mar 17 02:08:57 crc kubenswrapper[4735]: I0317 02:08:57.498027 4735 scope.go:117] "RemoveContainer" containerID="7764a271913fd8c2fe136730d1067f0bb4ebced129ba62fbfac1151fa1e1002d" Mar 17 02:08:58 crc kubenswrapper[4735]: I0317 02:08:58.356109 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerDied","Data":"69d98a8df63c9f3655f6263ee6192606ed8b3c17c15dcf7cb3aea724922586d8"} Mar 17 02:08:58 crc kubenswrapper[4735]: I0317 02:08:58.357338 4735 generic.go:334] "Generic (PLEG): container finished" podID="9aa444e8-d344-4da7-87ae-2af798c38598" containerID="69d98a8df63c9f3655f6263ee6192606ed8b3c17c15dcf7cb3aea724922586d8" exitCode=0 Mar 17 02:09:00 crc kubenswrapper[4735]: I0317 02:09:00.376805 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerStarted","Data":"55c4423275572920525cb4f2ec7e5ea202c47409f2f81c288953e3585cc7894b"} Mar 17 02:09:01 crc kubenswrapper[4735]: I0317 02:09:01.385557 4735 generic.go:334] "Generic (PLEG): container finished" podID="9aa444e8-d344-4da7-87ae-2af798c38598" containerID="55c4423275572920525cb4f2ec7e5ea202c47409f2f81c288953e3585cc7894b" exitCode=0 Mar 17 02:09:01 crc kubenswrapper[4735]: I0317 02:09:01.385600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerDied","Data":"55c4423275572920525cb4f2ec7e5ea202c47409f2f81c288953e3585cc7894b"} Mar 17 02:09:02 crc kubenswrapper[4735]: I0317 02:09:02.406006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerStarted","Data":"230d3115664eccbbadcfcd04693866003e454471804f965f356039cbdae3e3b4"} Mar 17 02:09:02 crc kubenswrapper[4735]: I0317 02:09:02.436071 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-plrz8" podStartSLOduration=3.901500258 podStartE2EDuration="7.431523532s" podCreationTimestamp="2026-03-17 02:08:55 +0000 UTC" firstStartedPulling="2026-03-17 02:08:58.36149841 +0000 UTC m=+3563.993731398" lastFinishedPulling="2026-03-17 02:09:01.891521684 +0000 UTC m=+3567.523754672" observedRunningTime="2026-03-17 02:09:02.430790204 +0000 UTC m=+3568.063023182" watchObservedRunningTime="2026-03-17 02:09:02.431523532 +0000 UTC m=+3568.063756510" Mar 17 02:09:05 crc kubenswrapper[4735]: I0317 02:09:05.941370 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:09:05 crc kubenswrapper[4735]: I0317 
02:09:05.941943 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:09:07 crc kubenswrapper[4735]: I0317 02:09:07.036692 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-plrz8" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="registry-server" probeResult="failure" output=< Mar 17 02:09:07 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:09:07 crc kubenswrapper[4735]: > Mar 17 02:09:17 crc kubenswrapper[4735]: I0317 02:09:17.018695 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-plrz8" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="registry-server" probeResult="failure" output=< Mar 17 02:09:17 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:09:17 crc kubenswrapper[4735]: > Mar 17 02:09:26 crc kubenswrapper[4735]: I0317 02:09:26.115660 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:09:26 crc kubenswrapper[4735]: I0317 02:09:26.164205 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:09:26 crc kubenswrapper[4735]: I0317 02:09:26.261177 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plrz8"] Mar 17 02:09:27 crc kubenswrapper[4735]: I0317 02:09:27.610902 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-plrz8" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="registry-server" containerID="cri-o://230d3115664eccbbadcfcd04693866003e454471804f965f356039cbdae3e3b4" gracePeriod=2 Mar 17 02:09:28 crc kubenswrapper[4735]: I0317 02:09:28.625118 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="9aa444e8-d344-4da7-87ae-2af798c38598" containerID="230d3115664eccbbadcfcd04693866003e454471804f965f356039cbdae3e3b4" exitCode=0 Mar 17 02:09:28 crc kubenswrapper[4735]: I0317 02:09:28.625192 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerDied","Data":"230d3115664eccbbadcfcd04693866003e454471804f965f356039cbdae3e3b4"} Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.055535 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.185262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnbbc\" (UniqueName: \"kubernetes.io/projected/9aa444e8-d344-4da7-87ae-2af798c38598-kube-api-access-cnbbc\") pod \"9aa444e8-d344-4da7-87ae-2af798c38598\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.185335 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-utilities\") pod \"9aa444e8-d344-4da7-87ae-2af798c38598\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.185484 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-catalog-content\") pod \"9aa444e8-d344-4da7-87ae-2af798c38598\" (UID: \"9aa444e8-d344-4da7-87ae-2af798c38598\") " Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.191933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-utilities" (OuterVolumeSpecName: "utilities") pod 
"9aa444e8-d344-4da7-87ae-2af798c38598" (UID: "9aa444e8-d344-4da7-87ae-2af798c38598"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.221599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa444e8-d344-4da7-87ae-2af798c38598-kube-api-access-cnbbc" (OuterVolumeSpecName: "kube-api-access-cnbbc") pod "9aa444e8-d344-4da7-87ae-2af798c38598" (UID: "9aa444e8-d344-4da7-87ae-2af798c38598"). InnerVolumeSpecName "kube-api-access-cnbbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.259438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9aa444e8-d344-4da7-87ae-2af798c38598" (UID: "9aa444e8-d344-4da7-87ae-2af798c38598"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.288494 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnbbc\" (UniqueName: \"kubernetes.io/projected/9aa444e8-d344-4da7-87ae-2af798c38598-kube-api-access-cnbbc\") on node \"crc\" DevicePath \"\"" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.288847 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.288870 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa444e8-d344-4da7-87ae-2af798c38598-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.637189 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plrz8" event={"ID":"9aa444e8-d344-4da7-87ae-2af798c38598","Type":"ContainerDied","Data":"ac96a6d5d2539803ac9ae3d5e70d5f0cb7fed958fde56b1ec61572f62b6a9767"} Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.638685 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plrz8" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.639084 4735 scope.go:117] "RemoveContainer" containerID="230d3115664eccbbadcfcd04693866003e454471804f965f356039cbdae3e3b4" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.690223 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plrz8"] Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.699183 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-plrz8"] Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.710961 4735 scope.go:117] "RemoveContainer" containerID="55c4423275572920525cb4f2ec7e5ea202c47409f2f81c288953e3585cc7894b" Mar 17 02:09:29 crc kubenswrapper[4735]: I0317 02:09:29.756504 4735 scope.go:117] "RemoveContainer" containerID="69d98a8df63c9f3655f6263ee6192606ed8b3c17c15dcf7cb3aea724922586d8" Mar 17 02:09:31 crc kubenswrapper[4735]: I0317 02:09:31.089182 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" path="/var/lib/kubelet/pods/9aa444e8-d344-4da7-87ae-2af798c38598/volumes" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.562939 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561890-vt7ht"] Mar 17 02:10:00 crc kubenswrapper[4735]: E0317 02:10:00.569674 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="extract-utilities" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.569716 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="extract-utilities" Mar 17 02:10:00 crc kubenswrapper[4735]: E0317 02:10:00.570523 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="registry-server" Mar 17 
02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.570537 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="registry-server" Mar 17 02:10:00 crc kubenswrapper[4735]: E0317 02:10:00.570554 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="extract-content" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.570562 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="extract-content" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.572310 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa444e8-d344-4da7-87ae-2af798c38598" containerName="registry-server" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.579760 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.595929 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.595943 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.595929 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.671544 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-vt7ht"] Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.724265 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wl4\" (UniqueName: \"kubernetes.io/projected/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae-kube-api-access-w2wl4\") pod 
\"auto-csr-approver-29561890-vt7ht\" (UID: \"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae\") " pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.826548 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wl4\" (UniqueName: \"kubernetes.io/projected/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae-kube-api-access-w2wl4\") pod \"auto-csr-approver-29561890-vt7ht\" (UID: \"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae\") " pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.864892 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wl4\" (UniqueName: \"kubernetes.io/projected/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae-kube-api-access-w2wl4\") pod \"auto-csr-approver-29561890-vt7ht\" (UID: \"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae\") " pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:00 crc kubenswrapper[4735]: I0317 02:10:00.925921 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:02 crc kubenswrapper[4735]: I0317 02:10:02.265968 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-vt7ht"] Mar 17 02:10:02 crc kubenswrapper[4735]: I0317 02:10:02.952082 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" event={"ID":"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae","Type":"ContainerStarted","Data":"6f7432da8215d0bdbd91d5c01a8d0b36ce88a557e2e54698e568ada0a0914555"} Mar 17 02:10:04 crc kubenswrapper[4735]: I0317 02:10:04.971983 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" event={"ID":"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae","Type":"ContainerStarted","Data":"60c86c63801bfcf1015535e6240d1df65f2ce60bc2944990bc50b297d878ce74"} Mar 17 02:10:06 crc kubenswrapper[4735]: I0317 02:10:06.989575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" event={"ID":"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae","Type":"ContainerDied","Data":"60c86c63801bfcf1015535e6240d1df65f2ce60bc2944990bc50b297d878ce74"} Mar 17 02:10:06 crc kubenswrapper[4735]: I0317 02:10:06.992084 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae" containerID="60c86c63801bfcf1015535e6240d1df65f2ce60bc2944990bc50b297d878ce74" exitCode=0 Mar 17 02:10:08 crc kubenswrapper[4735]: I0317 02:10:08.539383 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:08 crc kubenswrapper[4735]: I0317 02:10:08.703924 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wl4\" (UniqueName: \"kubernetes.io/projected/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae-kube-api-access-w2wl4\") pod \"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae\" (UID: \"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae\") " Mar 17 02:10:08 crc kubenswrapper[4735]: I0317 02:10:08.727072 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae-kube-api-access-w2wl4" (OuterVolumeSpecName: "kube-api-access-w2wl4") pod "6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae" (UID: "6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae"). InnerVolumeSpecName "kube-api-access-w2wl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:10:08 crc kubenswrapper[4735]: I0317 02:10:08.809076 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wl4\" (UniqueName: \"kubernetes.io/projected/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae-kube-api-access-w2wl4\") on node \"crc\" DevicePath \"\"" Mar 17 02:10:09 crc kubenswrapper[4735]: I0317 02:10:09.014819 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" event={"ID":"6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae","Type":"ContainerDied","Data":"6f7432da8215d0bdbd91d5c01a8d0b36ce88a557e2e54698e568ada0a0914555"} Mar 17 02:10:09 crc kubenswrapper[4735]: I0317 02:10:09.014887 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-vt7ht" Mar 17 02:10:09 crc kubenswrapper[4735]: I0317 02:10:09.014888 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f7432da8215d0bdbd91d5c01a8d0b36ce88a557e2e54698e568ada0a0914555" Mar 17 02:10:09 crc kubenswrapper[4735]: I0317 02:10:09.127985 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-hlvkq"] Mar 17 02:10:09 crc kubenswrapper[4735]: I0317 02:10:09.135969 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-hlvkq"] Mar 17 02:10:11 crc kubenswrapper[4735]: I0317 02:10:11.086567 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8438b716-2caa-4b91-90d3-ecfd2d9804fb" path="/var/lib/kubelet/pods/8438b716-2caa-4b91-90d3-ecfd2d9804fb/volumes" Mar 17 02:10:12 crc kubenswrapper[4735]: I0317 02:10:12.607372 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:10:12 crc kubenswrapper[4735]: I0317 02:10:12.609231 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.265056 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6t26j"] Mar 17 02:10:42 crc kubenswrapper[4735]: E0317 02:10:42.272176 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae" 
containerName="oc" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.272223 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae" containerName="oc" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.274055 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae" containerName="oc" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.283814 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.305570 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t26j"] Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.324625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-utilities\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.324695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-catalog-content\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.324846 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct892\" (UniqueName: \"kubernetes.io/projected/60d60131-db65-4f42-8c18-70c8d9e266be-kube-api-access-ct892\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " 
pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.426411 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct892\" (UniqueName: \"kubernetes.io/projected/60d60131-db65-4f42-8c18-70c8d9e266be-kube-api-access-ct892\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.426521 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-utilities\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.426550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-catalog-content\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.431633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-utilities\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.432658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-catalog-content\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " 
pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.479118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct892\" (UniqueName: \"kubernetes.io/projected/60d60131-db65-4f42-8c18-70c8d9e266be-kube-api-access-ct892\") pod \"certified-operators-6t26j\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.606931 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.608063 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:10:42 crc kubenswrapper[4735]: I0317 02:10:42.612385 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:44 crc kubenswrapper[4735]: I0317 02:10:44.043972 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t26j"] Mar 17 02:10:44 crc kubenswrapper[4735]: I0317 02:10:44.373414 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerDied","Data":"4bb92a251affa8d833a0d3f0edd48ebac449acd3d35fe94e28a1590aaf993e40"} Mar 17 02:10:44 crc kubenswrapper[4735]: I0317 02:10:44.375110 4735 generic.go:334] "Generic (PLEG): container finished" podID="60d60131-db65-4f42-8c18-70c8d9e266be" containerID="4bb92a251affa8d833a0d3f0edd48ebac449acd3d35fe94e28a1590aaf993e40" exitCode=0 Mar 17 02:10:44 crc kubenswrapper[4735]: I0317 02:10:44.375495 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerStarted","Data":"a26564079959105c5c0435b5e821b43ff44e55ea81afbef6b0eac653d09712d4"} Mar 17 02:10:45 crc kubenswrapper[4735]: I0317 02:10:45.391465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerStarted","Data":"5df2e3523da6aef9353f2110b172010d9357c940d955fdf6846bdf57414bfaeb"} Mar 17 02:10:47 crc kubenswrapper[4735]: I0317 02:10:47.409101 4735 generic.go:334] "Generic (PLEG): container finished" podID="60d60131-db65-4f42-8c18-70c8d9e266be" containerID="5df2e3523da6aef9353f2110b172010d9357c940d955fdf6846bdf57414bfaeb" exitCode=0 Mar 17 02:10:47 crc kubenswrapper[4735]: I0317 02:10:47.409215 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" 
event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerDied","Data":"5df2e3523da6aef9353f2110b172010d9357c940d955fdf6846bdf57414bfaeb"} Mar 17 02:10:48 crc kubenswrapper[4735]: I0317 02:10:48.422649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerStarted","Data":"19a5d01f6cdb638d433b7759423179632ab7aa6ec86783d96578579816489246"} Mar 17 02:10:48 crc kubenswrapper[4735]: I0317 02:10:48.469847 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6t26j" podStartSLOduration=2.960711536 podStartE2EDuration="6.464032175s" podCreationTimestamp="2026-03-17 02:10:42 +0000 UTC" firstStartedPulling="2026-03-17 02:10:44.37578587 +0000 UTC m=+3670.008018848" lastFinishedPulling="2026-03-17 02:10:47.879106499 +0000 UTC m=+3673.511339487" observedRunningTime="2026-03-17 02:10:48.45537709 +0000 UTC m=+3674.087610078" watchObservedRunningTime="2026-03-17 02:10:48.464032175 +0000 UTC m=+3674.096265163" Mar 17 02:10:52 crc kubenswrapper[4735]: I0317 02:10:52.615299 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:52 crc kubenswrapper[4735]: I0317 02:10:52.615707 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:10:53 crc kubenswrapper[4735]: I0317 02:10:53.660051 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6t26j" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="registry-server" probeResult="failure" output=< Mar 17 02:10:53 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:10:53 crc kubenswrapper[4735]: > Mar 17 02:10:57 crc kubenswrapper[4735]: I0317 02:10:57.908121 4735 scope.go:117] 
"RemoveContainer" containerID="dc58a5c67fcb34b4b07539cefa66c8e6f8e1a857498fbda5851144890d8e292f" Mar 17 02:11:03 crc kubenswrapper[4735]: I0317 02:11:03.767740 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6t26j" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="registry-server" probeResult="failure" output=< Mar 17 02:11:03 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:11:03 crc kubenswrapper[4735]: > Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.606964 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.608187 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.608227 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.611302 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.612322 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" gracePeriod=600 Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.680948 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:11:12 crc kubenswrapper[4735]: I0317 02:11:12.727372 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:11:12 crc kubenswrapper[4735]: E0317 02:11:12.772615 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:11:13 crc kubenswrapper[4735]: I0317 02:11:13.966953 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t26j"] Mar 17 02:11:13 crc kubenswrapper[4735]: I0317 02:11:13.970165 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" exitCode=0 Mar 17 02:11:13 crc kubenswrapper[4735]: I0317 02:11:13.970394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10"} Mar 17 02:11:13 crc 
kubenswrapper[4735]: I0317 02:11:13.970443 4735 scope.go:117] "RemoveContainer" containerID="a3d06a6fc67b38d88c34757ec2082c5aad8ceedf120420810b5218b69d71fdd9" Mar 17 02:11:13 crc kubenswrapper[4735]: I0317 02:11:13.971233 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:11:13 crc kubenswrapper[4735]: E0317 02:11:13.971528 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:11:14 crc kubenswrapper[4735]: I0317 02:11:14.979814 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6t26j" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="registry-server" containerID="cri-o://19a5d01f6cdb638d433b7759423179632ab7aa6ec86783d96578579816489246" gracePeriod=2 Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.011096 4735 generic.go:334] "Generic (PLEG): container finished" podID="60d60131-db65-4f42-8c18-70c8d9e266be" containerID="19a5d01f6cdb638d433b7759423179632ab7aa6ec86783d96578579816489246" exitCode=0 Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.011263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerDied","Data":"19a5d01f6cdb638d433b7759423179632ab7aa6ec86783d96578579816489246"} Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.363505 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.394220 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct892\" (UniqueName: \"kubernetes.io/projected/60d60131-db65-4f42-8c18-70c8d9e266be-kube-api-access-ct892\") pod \"60d60131-db65-4f42-8c18-70c8d9e266be\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.394582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-catalog-content\") pod \"60d60131-db65-4f42-8c18-70c8d9e266be\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.394648 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-utilities\") pod \"60d60131-db65-4f42-8c18-70c8d9e266be\" (UID: \"60d60131-db65-4f42-8c18-70c8d9e266be\") " Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.445789 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-utilities" (OuterVolumeSpecName: "utilities") pod "60d60131-db65-4f42-8c18-70c8d9e266be" (UID: "60d60131-db65-4f42-8c18-70c8d9e266be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.478829 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d60131-db65-4f42-8c18-70c8d9e266be-kube-api-access-ct892" (OuterVolumeSpecName: "kube-api-access-ct892") pod "60d60131-db65-4f42-8c18-70c8d9e266be" (UID: "60d60131-db65-4f42-8c18-70c8d9e266be"). InnerVolumeSpecName "kube-api-access-ct892". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.497353 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.497386 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct892\" (UniqueName: \"kubernetes.io/projected/60d60131-db65-4f42-8c18-70c8d9e266be-kube-api-access-ct892\") on node \"crc\" DevicePath \"\"" Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.601048 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60d60131-db65-4f42-8c18-70c8d9e266be" (UID: "60d60131-db65-4f42-8c18-70c8d9e266be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:11:16 crc kubenswrapper[4735]: I0317 02:11:16.700837 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d60131-db65-4f42-8c18-70c8d9e266be-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.021820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t26j" event={"ID":"60d60131-db65-4f42-8c18-70c8d9e266be","Type":"ContainerDied","Data":"a26564079959105c5c0435b5e821b43ff44e55ea81afbef6b0eac653d09712d4"} Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.021877 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t26j" Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.022123 4735 scope.go:117] "RemoveContainer" containerID="19a5d01f6cdb638d433b7759423179632ab7aa6ec86783d96578579816489246" Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.059113 4735 scope.go:117] "RemoveContainer" containerID="5df2e3523da6aef9353f2110b172010d9357c940d955fdf6846bdf57414bfaeb" Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.060923 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t26j"] Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.092988 4735 scope.go:117] "RemoveContainer" containerID="4bb92a251affa8d833a0d3f0edd48ebac449acd3d35fe94e28a1590aaf993e40" Mar 17 02:11:17 crc kubenswrapper[4735]: I0317 02:11:17.094453 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6t26j"] Mar 17 02:11:19 crc kubenswrapper[4735]: I0317 02:11:19.085944 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" path="/var/lib/kubelet/pods/60d60131-db65-4f42-8c18-70c8d9e266be/volumes" Mar 17 02:11:27 crc kubenswrapper[4735]: I0317 02:11:27.077175 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:11:27 crc kubenswrapper[4735]: E0317 02:11:27.078461 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:11:38 crc kubenswrapper[4735]: I0317 02:11:38.074351 4735 scope.go:117] "RemoveContainer" 
containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:11:38 crc kubenswrapper[4735]: E0317 02:11:38.075459 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:11:53 crc kubenswrapper[4735]: I0317 02:11:53.074127 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:11:53 crc kubenswrapper[4735]: E0317 02:11:53.075241 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.516188 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vsrk9"] Mar 17 02:12:00 crc kubenswrapper[4735]: E0317 02:12:00.522147 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="registry-server" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.522287 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="registry-server" Mar 17 02:12:00 crc kubenswrapper[4735]: E0317 02:12:00.522333 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" 
containerName="extract-utilities" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.522340 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="extract-utilities" Mar 17 02:12:00 crc kubenswrapper[4735]: E0317 02:12:00.522363 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="extract-content" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.522369 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="extract-content" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.525058 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d60131-db65-4f42-8c18-70c8d9e266be" containerName="registry-server" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.533162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.542515 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.542798 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.543959 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.599389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ktt\" (UniqueName: \"kubernetes.io/projected/f2ef27bd-1f2f-4cee-a78a-ef79b83321de-kube-api-access-g9ktt\") pod \"auto-csr-approver-29561892-vsrk9\" (UID: \"f2ef27bd-1f2f-4cee-a78a-ef79b83321de\") " pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 
02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.620751 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vsrk9"] Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.701650 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ktt\" (UniqueName: \"kubernetes.io/projected/f2ef27bd-1f2f-4cee-a78a-ef79b83321de-kube-api-access-g9ktt\") pod \"auto-csr-approver-29561892-vsrk9\" (UID: \"f2ef27bd-1f2f-4cee-a78a-ef79b83321de\") " pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.745089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9ktt\" (UniqueName: \"kubernetes.io/projected/f2ef27bd-1f2f-4cee-a78a-ef79b83321de-kube-api-access-g9ktt\") pod \"auto-csr-approver-29561892-vsrk9\" (UID: \"f2ef27bd-1f2f-4cee-a78a-ef79b83321de\") " pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 02:12:00 crc kubenswrapper[4735]: I0317 02:12:00.867005 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 02:12:02 crc kubenswrapper[4735]: I0317 02:12:02.210128 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vsrk9"] Mar 17 02:12:02 crc kubenswrapper[4735]: I0317 02:12:02.265133 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:12:02 crc kubenswrapper[4735]: I0317 02:12:02.433003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" event={"ID":"f2ef27bd-1f2f-4cee-a78a-ef79b83321de","Type":"ContainerStarted","Data":"313d06445f9e3176f022154d65ac32319d4e8dd7aa218ac7d3d8b01e001129a5"} Mar 17 02:12:04 crc kubenswrapper[4735]: I0317 02:12:04.451150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" event={"ID":"f2ef27bd-1f2f-4cee-a78a-ef79b83321de","Type":"ContainerStarted","Data":"f49c69688c59f7491698bdbc129fdd8680f7807a97fa2c94425dbdd0eeada84b"} Mar 17 02:12:04 crc kubenswrapper[4735]: I0317 02:12:04.473467 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" podStartSLOduration=3.444734531 podStartE2EDuration="4.472190988s" podCreationTimestamp="2026-03-17 02:12:00 +0000 UTC" firstStartedPulling="2026-03-17 02:12:02.25728142 +0000 UTC m=+3747.889514438" lastFinishedPulling="2026-03-17 02:12:03.284737897 +0000 UTC m=+3748.916970895" observedRunningTime="2026-03-17 02:12:04.468659074 +0000 UTC m=+3750.100892052" watchObservedRunningTime="2026-03-17 02:12:04.472190988 +0000 UTC m=+3750.104423976" Mar 17 02:12:05 crc kubenswrapper[4735]: I0317 02:12:05.094597 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:12:05 crc kubenswrapper[4735]: E0317 02:12:05.094964 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:12:06 crc kubenswrapper[4735]: I0317 02:12:06.471406 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" event={"ID":"f2ef27bd-1f2f-4cee-a78a-ef79b83321de","Type":"ContainerDied","Data":"f49c69688c59f7491698bdbc129fdd8680f7807a97fa2c94425dbdd0eeada84b"} Mar 17 02:12:06 crc kubenswrapper[4735]: I0317 02:12:06.471315 4735 generic.go:334] "Generic (PLEG): container finished" podID="f2ef27bd-1f2f-4cee-a78a-ef79b83321de" containerID="f49c69688c59f7491698bdbc129fdd8680f7807a97fa2c94425dbdd0eeada84b" exitCode=0 Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.022545 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.076640 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9ktt\" (UniqueName: \"kubernetes.io/projected/f2ef27bd-1f2f-4cee-a78a-ef79b83321de-kube-api-access-g9ktt\") pod \"f2ef27bd-1f2f-4cee-a78a-ef79b83321de\" (UID: \"f2ef27bd-1f2f-4cee-a78a-ef79b83321de\") " Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.100026 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ef27bd-1f2f-4cee-a78a-ef79b83321de-kube-api-access-g9ktt" (OuterVolumeSpecName: "kube-api-access-g9ktt") pod "f2ef27bd-1f2f-4cee-a78a-ef79b83321de" (UID: "f2ef27bd-1f2f-4cee-a78a-ef79b83321de"). InnerVolumeSpecName "kube-api-access-g9ktt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.178942 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9ktt\" (UniqueName: \"kubernetes.io/projected/f2ef27bd-1f2f-4cee-a78a-ef79b83321de-kube-api-access-g9ktt\") on node \"crc\" DevicePath \"\"" Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.498101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" event={"ID":"f2ef27bd-1f2f-4cee-a78a-ef79b83321de","Type":"ContainerDied","Data":"313d06445f9e3176f022154d65ac32319d4e8dd7aa218ac7d3d8b01e001129a5"} Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.499716 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313d06445f9e3176f022154d65ac32319d4e8dd7aa218ac7d3d8b01e001129a5" Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.499822 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vsrk9" Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.591519 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-jhvfb"] Mar 17 02:12:08 crc kubenswrapper[4735]: I0317 02:12:08.603887 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-jhvfb"] Mar 17 02:12:09 crc kubenswrapper[4735]: I0317 02:12:09.093395 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d621d6d4-2334-43f4-8f1b-b3f74ca9dd00" path="/var/lib/kubelet/pods/d621d6d4-2334-43f4-8f1b-b3f74ca9dd00/volumes" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.776227 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vr2zf"] Mar 17 02:12:16 crc kubenswrapper[4735]: E0317 02:12:16.777300 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f2ef27bd-1f2f-4cee-a78a-ef79b83321de" containerName="oc" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.777315 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ef27bd-1f2f-4cee-a78a-ef79b83321de" containerName="oc" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.777496 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ef27bd-1f2f-4cee-a78a-ef79b83321de" containerName="oc" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.778995 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.802457 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vr2zf"] Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.960517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fng7\" (UniqueName: \"kubernetes.io/projected/964d64db-9b70-41f9-b8a1-10b844ba9f79-kube-api-access-7fng7\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.960554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-utilities\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:16 crc kubenswrapper[4735]: I0317 02:12:16.960600 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-catalog-content\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " 
pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.062216 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fng7\" (UniqueName: \"kubernetes.io/projected/964d64db-9b70-41f9-b8a1-10b844ba9f79-kube-api-access-7fng7\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.062294 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-utilities\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.062583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-catalog-content\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.064710 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-utilities\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.065330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-catalog-content\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc 
kubenswrapper[4735]: I0317 02:12:17.085187 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fng7\" (UniqueName: \"kubernetes.io/projected/964d64db-9b70-41f9-b8a1-10b844ba9f79-kube-api-access-7fng7\") pod \"redhat-operators-vr2zf\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.100938 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:17 crc kubenswrapper[4735]: I0317 02:12:17.919921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vr2zf"] Mar 17 02:12:18 crc kubenswrapper[4735]: I0317 02:12:18.074477 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:12:18 crc kubenswrapper[4735]: E0317 02:12:18.075362 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:12:18 crc kubenswrapper[4735]: I0317 02:12:18.594266 4735 generic.go:334] "Generic (PLEG): container finished" podID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerID="c15af59d9cd1d955f5148cab6a271316696b4c57bb29ddaeb8708d7d8f08aa05" exitCode=0 Mar 17 02:12:18 crc kubenswrapper[4735]: I0317 02:12:18.594587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerDied","Data":"c15af59d9cd1d955f5148cab6a271316696b4c57bb29ddaeb8708d7d8f08aa05"} Mar 17 02:12:18 
crc kubenswrapper[4735]: I0317 02:12:18.595361 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerStarted","Data":"19b1583601d4585d881e26dd94689cb5950dac949be9231dd79b85646dd2d9ae"} Mar 17 02:12:19 crc kubenswrapper[4735]: I0317 02:12:19.605964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerStarted","Data":"92a58447c8696390cbabad1becfb9be3de623022cc4fe03e7e92f7dbad926e13"} Mar 17 02:12:25 crc kubenswrapper[4735]: I0317 02:12:25.666981 4735 generic.go:334] "Generic (PLEG): container finished" podID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerID="92a58447c8696390cbabad1becfb9be3de623022cc4fe03e7e92f7dbad926e13" exitCode=0 Mar 17 02:12:25 crc kubenswrapper[4735]: I0317 02:12:25.667219 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerDied","Data":"92a58447c8696390cbabad1becfb9be3de623022cc4fe03e7e92f7dbad926e13"} Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.333008 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.341158 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.352835 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.475712 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhm4\" (UniqueName: \"kubernetes.io/projected/4dd837c7-abaf-4b93-a871-7f9a370120c9-kube-api-access-vvhm4\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.475915 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-utilities\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.476017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-catalog-content\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.577437 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhm4\" (UniqueName: \"kubernetes.io/projected/4dd837c7-abaf-4b93-a871-7f9a370120c9-kube-api-access-vvhm4\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.577497 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-utilities\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.577530 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-catalog-content\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.578547 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-catalog-content\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.578550 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-utilities\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.616741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhm4\" (UniqueName: \"kubernetes.io/projected/4dd837c7-abaf-4b93-a871-7f9a370120c9-kube-api-access-vvhm4\") pod \"community-operators-7phjv\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.679266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerStarted","Data":"595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888"} Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.689124 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:26 crc kubenswrapper[4735]: I0317 02:12:26.716036 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vr2zf" podStartSLOduration=3.222634859 podStartE2EDuration="10.714450502s" podCreationTimestamp="2026-03-17 02:12:16 +0000 UTC" firstStartedPulling="2026-03-17 02:12:18.596427322 +0000 UTC m=+3764.228660300" lastFinishedPulling="2026-03-17 02:12:26.088242925 +0000 UTC m=+3771.720475943" observedRunningTime="2026-03-17 02:12:26.706010482 +0000 UTC m=+3772.338243450" watchObservedRunningTime="2026-03-17 02:12:26.714450502 +0000 UTC m=+3772.346683490" Mar 17 02:12:27 crc kubenswrapper[4735]: I0317 02:12:27.102302 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:27 crc kubenswrapper[4735]: I0317 02:12:27.102480 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:12:27 crc kubenswrapper[4735]: I0317 02:12:27.596102 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:12:27 crc kubenswrapper[4735]: I0317 02:12:27.694154 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerStarted","Data":"7d16fd54c985cc2af3f16379373cb4393090f7adbbe1ad22618d6fef9a5bf3a5"} Mar 17 02:12:28 crc kubenswrapper[4735]: I0317 02:12:28.211222 4735 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:12:28 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:12:28 crc kubenswrapper[4735]: > Mar 17 02:12:28 crc kubenswrapper[4735]: I0317 02:12:28.704636 4735 generic.go:334] "Generic (PLEG): container finished" podID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerID="7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9" exitCode=0 Mar 17 02:12:28 crc kubenswrapper[4735]: I0317 02:12:28.704782 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerDied","Data":"7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9"} Mar 17 02:12:33 crc kubenswrapper[4735]: I0317 02:12:33.076950 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:12:33 crc kubenswrapper[4735]: E0317 02:12:33.077520 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:12:37 crc kubenswrapper[4735]: I0317 02:12:37.782055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerStarted","Data":"77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf"} Mar 17 02:12:38 crc kubenswrapper[4735]: I0317 02:12:38.150249 4735 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:12:38 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:12:38 crc kubenswrapper[4735]: > Mar 17 02:12:39 crc kubenswrapper[4735]: I0317 02:12:39.797341 4735 generic.go:334] "Generic (PLEG): container finished" podID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerID="77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf" exitCode=0 Mar 17 02:12:39 crc kubenswrapper[4735]: I0317 02:12:39.797423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerDied","Data":"77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf"} Mar 17 02:12:40 crc kubenswrapper[4735]: I0317 02:12:40.816401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerStarted","Data":"d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77"} Mar 17 02:12:40 crc kubenswrapper[4735]: I0317 02:12:40.847538 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7phjv" podStartSLOduration=3.253523287 podStartE2EDuration="14.847515777s" podCreationTimestamp="2026-03-17 02:12:26 +0000 UTC" firstStartedPulling="2026-03-17 02:12:28.708462129 +0000 UTC m=+3774.340695117" lastFinishedPulling="2026-03-17 02:12:40.302454629 +0000 UTC m=+3785.934687607" observedRunningTime="2026-03-17 02:12:40.838024071 +0000 UTC m=+3786.470257049" watchObservedRunningTime="2026-03-17 02:12:40.847515777 +0000 UTC m=+3786.479748755" Mar 17 02:12:46 crc kubenswrapper[4735]: I0317 02:12:46.076157 4735 scope.go:117] "RemoveContainer" 
containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:12:46 crc kubenswrapper[4735]: E0317 02:12:46.079563 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:12:46 crc kubenswrapper[4735]: I0317 02:12:46.690027 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:46 crc kubenswrapper[4735]: I0317 02:12:46.690399 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:12:47 crc kubenswrapper[4735]: I0317 02:12:47.785371 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7phjv" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="registry-server" probeResult="failure" output=< Mar 17 02:12:47 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:12:47 crc kubenswrapper[4735]: > Mar 17 02:12:48 crc kubenswrapper[4735]: I0317 02:12:48.147970 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:12:48 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:12:48 crc kubenswrapper[4735]: > Mar 17 02:12:57 crc kubenswrapper[4735]: I0317 02:12:57.791244 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7phjv" 
podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="registry-server" probeResult="failure" output=< Mar 17 02:12:57 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:12:57 crc kubenswrapper[4735]: > Mar 17 02:12:58 crc kubenswrapper[4735]: I0317 02:12:58.146225 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:12:58 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:12:58 crc kubenswrapper[4735]: > Mar 17 02:12:58 crc kubenswrapper[4735]: I0317 02:12:58.309583 4735 scope.go:117] "RemoveContainer" containerID="b245f5bed36bf6117ed46cedcef2a8749c6012f09b174636f3460266f83b23cc" Mar 17 02:12:59 crc kubenswrapper[4735]: I0317 02:12:59.074658 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:12:59 crc kubenswrapper[4735]: E0317 02:12:59.075585 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:13:06 crc kubenswrapper[4735]: I0317 02:13:06.884417 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:13:06 crc kubenswrapper[4735]: I0317 02:13:06.930318 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:13:07 crc kubenswrapper[4735]: I0317 02:13:07.447583 4735 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:13:07 crc kubenswrapper[4735]: I0317 02:13:07.491486 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6ssc"] Mar 17 02:13:07 crc kubenswrapper[4735]: I0317 02:13:07.499709 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n6ssc" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="registry-server" containerID="cri-o://6b34533a5ae373b7adc1035147c72c494f45e0974d3cdd0ea4f48b57797ef6f3" gracePeriod=2 Mar 17 02:13:08 crc kubenswrapper[4735]: I0317 02:13:08.098181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6ssc" event={"ID":"8c41bd0b-1749-486f-a7c6-c86362e3c03c","Type":"ContainerDied","Data":"6b34533a5ae373b7adc1035147c72c494f45e0974d3cdd0ea4f48b57797ef6f3"} Mar 17 02:13:08 crc kubenswrapper[4735]: I0317 02:13:08.099922 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerID="6b34533a5ae373b7adc1035147c72c494f45e0974d3cdd0ea4f48b57797ef6f3" exitCode=0 Mar 17 02:13:08 crc kubenswrapper[4735]: I0317 02:13:08.150723 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:13:08 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:13:08 crc kubenswrapper[4735]: > Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.013134 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.074224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlshr\" (UniqueName: \"kubernetes.io/projected/8c41bd0b-1749-486f-a7c6-c86362e3c03c-kube-api-access-vlshr\") pod \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.074406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-utilities\") pod \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.074814 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-catalog-content\") pod \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\" (UID: \"8c41bd0b-1749-486f-a7c6-c86362e3c03c\") " Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.080042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-utilities" (OuterVolumeSpecName: "utilities") pod "8c41bd0b-1749-486f-a7c6-c86362e3c03c" (UID: "8c41bd0b-1749-486f-a7c6-c86362e3c03c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.104039 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c41bd0b-1749-486f-a7c6-c86362e3c03c-kube-api-access-vlshr" (OuterVolumeSpecName: "kube-api-access-vlshr") pod "8c41bd0b-1749-486f-a7c6-c86362e3c03c" (UID: "8c41bd0b-1749-486f-a7c6-c86362e3c03c"). InnerVolumeSpecName "kube-api-access-vlshr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.141277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6ssc" event={"ID":"8c41bd0b-1749-486f-a7c6-c86362e3c03c","Type":"ContainerDied","Data":"c9b373c6ad1ca5ad288d4222bf12a32155b7a8f581bd8a419712c013d165dbac"} Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.142954 4735 scope.go:117] "RemoveContainer" containerID="6b34533a5ae373b7adc1035147c72c494f45e0974d3cdd0ea4f48b57797ef6f3" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.143116 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6ssc" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.177431 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlshr\" (UniqueName: \"kubernetes.io/projected/8c41bd0b-1749-486f-a7c6-c86362e3c03c-kube-api-access-vlshr\") on node \"crc\" DevicePath \"\"" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.177470 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.199874 4735 scope.go:117] "RemoveContainer" containerID="075e8800cbf5e47ac99627981e2d61f9cdbde20ed9985443a1be16919c8cdc8e" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.256352 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c41bd0b-1749-486f-a7c6-c86362e3c03c" (UID: "8c41bd0b-1749-486f-a7c6-c86362e3c03c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.269960 4735 scope.go:117] "RemoveContainer" containerID="544ca71f24a1553dace12d8780827c9b29d4ae04c20385ded1a1128723898622" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.279027 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c41bd0b-1749-486f-a7c6-c86362e3c03c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.487011 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6ssc"] Mar 17 02:13:09 crc kubenswrapper[4735]: I0317 02:13:09.505446 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n6ssc"] Mar 17 02:13:11 crc kubenswrapper[4735]: I0317 02:13:11.083825 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" path="/var/lib/kubelet/pods/8c41bd0b-1749-486f-a7c6-c86362e3c03c/volumes" Mar 17 02:13:14 crc kubenswrapper[4735]: I0317 02:13:14.073712 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:13:14 crc kubenswrapper[4735]: E0317 02:13:14.077985 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:13:18 crc kubenswrapper[4735]: I0317 02:13:18.158236 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" 
containerName="registry-server" probeResult="failure" output=< Mar 17 02:13:18 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:13:18 crc kubenswrapper[4735]: > Mar 17 02:13:25 crc kubenswrapper[4735]: I0317 02:13:25.088185 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:13:25 crc kubenswrapper[4735]: E0317 02:13:25.090323 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:13:28 crc kubenswrapper[4735]: I0317 02:13:28.149875 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:13:28 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:13:28 crc kubenswrapper[4735]: > Mar 17 02:13:36 crc kubenswrapper[4735]: I0317 02:13:36.074266 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:13:36 crc kubenswrapper[4735]: E0317 02:13:36.077097 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:13:38 crc 
kubenswrapper[4735]: I0317 02:13:38.147920 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:13:38 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:13:38 crc kubenswrapper[4735]: > Mar 17 02:13:48 crc kubenswrapper[4735]: I0317 02:13:48.155118 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" probeResult="failure" output=< Mar 17 02:13:48 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:13:48 crc kubenswrapper[4735]: > Mar 17 02:13:51 crc kubenswrapper[4735]: I0317 02:13:51.073700 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:13:51 crc kubenswrapper[4735]: E0317 02:13:51.074337 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:13:57 crc kubenswrapper[4735]: I0317 02:13:57.175058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:13:57 crc kubenswrapper[4735]: I0317 02:13:57.259999 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:13:57 crc kubenswrapper[4735]: I0317 02:13:57.456814 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-vr2zf"] Mar 17 02:13:58 crc kubenswrapper[4735]: I0317 02:13:58.550605 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vr2zf" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" containerID="cri-o://595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888" gracePeriod=2 Mar 17 02:13:58 crc kubenswrapper[4735]: E0317 02:13:58.959187 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964d64db_9b70_41f9_b8a1_10b844ba9f79.slice/crio-595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964d64db_9b70_41f9_b8a1_10b844ba9f79.slice/crio-conmon-595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888.scope\": RecentStats: unable to find data in memory cache]" Mar 17 02:13:59 crc kubenswrapper[4735]: I0317 02:13:59.560304 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerDied","Data":"595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888"} Mar 17 02:13:59 crc kubenswrapper[4735]: I0317 02:13:59.561026 4735 generic.go:334] "Generic (PLEG): container finished" podID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerID="595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888" exitCode=0 Mar 17 02:13:59 crc kubenswrapper[4735]: I0317 02:13:59.967286 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.054450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-utilities\") pod \"964d64db-9b70-41f9-b8a1-10b844ba9f79\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.054508 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-catalog-content\") pod \"964d64db-9b70-41f9-b8a1-10b844ba9f79\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.054596 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fng7\" (UniqueName: \"kubernetes.io/projected/964d64db-9b70-41f9-b8a1-10b844ba9f79-kube-api-access-7fng7\") pod \"964d64db-9b70-41f9-b8a1-10b844ba9f79\" (UID: \"964d64db-9b70-41f9-b8a1-10b844ba9f79\") " Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.058284 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-utilities" (OuterVolumeSpecName: "utilities") pod "964d64db-9b70-41f9-b8a1-10b844ba9f79" (UID: "964d64db-9b70-41f9-b8a1-10b844ba9f79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.092081 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964d64db-9b70-41f9-b8a1-10b844ba9f79-kube-api-access-7fng7" (OuterVolumeSpecName: "kube-api-access-7fng7") pod "964d64db-9b70-41f9-b8a1-10b844ba9f79" (UID: "964d64db-9b70-41f9-b8a1-10b844ba9f79"). InnerVolumeSpecName "kube-api-access-7fng7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.157578 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fng7\" (UniqueName: \"kubernetes.io/projected/964d64db-9b70-41f9-b8a1-10b844ba9f79-kube-api-access-7fng7\") on node \"crc\" DevicePath \"\"" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.157614 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.319870 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "964d64db-9b70-41f9-b8a1-10b844ba9f79" (UID: "964d64db-9b70-41f9-b8a1-10b844ba9f79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.361245 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964d64db-9b70-41f9-b8a1-10b844ba9f79-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.484285 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561894-dkm2j"] Mar 17 02:14:00 crc kubenswrapper[4735]: E0317 02:14:00.488515 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="extract-content" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.488547 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="extract-content" Mar 17 02:14:00 crc kubenswrapper[4735]: E0317 02:14:00.488895 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="extract-content" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.488907 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="extract-content" Mar 17 02:14:00 crc kubenswrapper[4735]: E0317 02:14:00.488923 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="extract-utilities" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.488931 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="extract-utilities" Mar 17 02:14:00 crc kubenswrapper[4735]: E0317 02:14:00.488979 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.488996 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4735]: E0317 02:14:00.489013 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.489020 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4735]: E0317 02:14:00.489039 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="extract-utilities" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.489048 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="extract-utilities" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.490625 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.490652 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c41bd0b-1749-486f-a7c6-c86362e3c03c" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.496012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.514834 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.514834 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.514834 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.564719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2dj\" (UniqueName: \"kubernetes.io/projected/57ff207e-f79e-4d00-8d90-a069aecbbf08-kube-api-access-9x2dj\") pod \"auto-csr-approver-29561894-dkm2j\" (UID: \"57ff207e-f79e-4d00-8d90-a069aecbbf08\") " pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.570916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vr2zf" event={"ID":"964d64db-9b70-41f9-b8a1-10b844ba9f79","Type":"ContainerDied","Data":"19b1583601d4585d881e26dd94689cb5950dac949be9231dd79b85646dd2d9ae"} Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.570955 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vr2zf" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.570980 4735 scope.go:117] "RemoveContainer" containerID="595ad13f3c3e9ceb4ed0ed9d5209ed1c09d433757065aa6ab2f5df91c8642888" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.593644 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-dkm2j"] Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.603942 4735 scope.go:117] "RemoveContainer" containerID="92a58447c8696390cbabad1becfb9be3de623022cc4fe03e7e92f7dbad926e13" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.628270 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vr2zf"] Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.629140 4735 scope.go:117] "RemoveContainer" containerID="c15af59d9cd1d955f5148cab6a271316696b4c57bb29ddaeb8708d7d8f08aa05" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.637892 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vr2zf"] Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.666534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x2dj\" (UniqueName: \"kubernetes.io/projected/57ff207e-f79e-4d00-8d90-a069aecbbf08-kube-api-access-9x2dj\") pod \"auto-csr-approver-29561894-dkm2j\" (UID: \"57ff207e-f79e-4d00-8d90-a069aecbbf08\") " pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 02:14:00.688571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2dj\" (UniqueName: \"kubernetes.io/projected/57ff207e-f79e-4d00-8d90-a069aecbbf08-kube-api-access-9x2dj\") pod \"auto-csr-approver-29561894-dkm2j\" (UID: \"57ff207e-f79e-4d00-8d90-a069aecbbf08\") " pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:00 crc kubenswrapper[4735]: I0317 
02:14:00.825142 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:01 crc kubenswrapper[4735]: I0317 02:14:01.083677 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="964d64db-9b70-41f9-b8a1-10b844ba9f79" path="/var/lib/kubelet/pods/964d64db-9b70-41f9-b8a1-10b844ba9f79/volumes" Mar 17 02:14:01 crc kubenswrapper[4735]: I0317 02:14:01.398229 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-dkm2j"] Mar 17 02:14:01 crc kubenswrapper[4735]: I0317 02:14:01.581468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" event={"ID":"57ff207e-f79e-4d00-8d90-a069aecbbf08","Type":"ContainerStarted","Data":"c10251f0037fba52b330d381e15b5e60135283de86454b72ca8ee7275f326935"} Mar 17 02:14:04 crc kubenswrapper[4735]: I0317 02:14:04.608965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" event={"ID":"57ff207e-f79e-4d00-8d90-a069aecbbf08","Type":"ContainerStarted","Data":"ee388b79d64271f7deeb9aae628d19279eb90bfae6f0a952e27471fe6efd8865"} Mar 17 02:14:04 crc kubenswrapper[4735]: I0317 02:14:04.632951 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" podStartSLOduration=3.367538295 podStartE2EDuration="4.630988373s" podCreationTimestamp="2026-03-17 02:14:00 +0000 UTC" firstStartedPulling="2026-03-17 02:14:01.423301163 +0000 UTC m=+3867.055534141" lastFinishedPulling="2026-03-17 02:14:02.686751231 +0000 UTC m=+3868.318984219" observedRunningTime="2026-03-17 02:14:04.623400123 +0000 UTC m=+3870.255633101" watchObservedRunningTime="2026-03-17 02:14:04.630988373 +0000 UTC m=+3870.263221351" Mar 17 02:14:05 crc kubenswrapper[4735]: I0317 02:14:05.079768 4735 scope.go:117] "RemoveContainer" 
containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:14:05 crc kubenswrapper[4735]: E0317 02:14:05.080122 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:14:05 crc kubenswrapper[4735]: I0317 02:14:05.617922 4735 generic.go:334] "Generic (PLEG): container finished" podID="57ff207e-f79e-4d00-8d90-a069aecbbf08" containerID="ee388b79d64271f7deeb9aae628d19279eb90bfae6f0a952e27471fe6efd8865" exitCode=0 Mar 17 02:14:05 crc kubenswrapper[4735]: I0317 02:14:05.617962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" event={"ID":"57ff207e-f79e-4d00-8d90-a069aecbbf08","Type":"ContainerDied","Data":"ee388b79d64271f7deeb9aae628d19279eb90bfae6f0a952e27471fe6efd8865"} Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.147464 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.330703 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x2dj\" (UniqueName: \"kubernetes.io/projected/57ff207e-f79e-4d00-8d90-a069aecbbf08-kube-api-access-9x2dj\") pod \"57ff207e-f79e-4d00-8d90-a069aecbbf08\" (UID: \"57ff207e-f79e-4d00-8d90-a069aecbbf08\") " Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.355437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ff207e-f79e-4d00-8d90-a069aecbbf08-kube-api-access-9x2dj" (OuterVolumeSpecName: "kube-api-access-9x2dj") pod "57ff207e-f79e-4d00-8d90-a069aecbbf08" (UID: "57ff207e-f79e-4d00-8d90-a069aecbbf08"). InnerVolumeSpecName "kube-api-access-9x2dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.433104 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x2dj\" (UniqueName: \"kubernetes.io/projected/57ff207e-f79e-4d00-8d90-a069aecbbf08-kube-api-access-9x2dj\") on node \"crc\" DevicePath \"\"" Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.637820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" event={"ID":"57ff207e-f79e-4d00-8d90-a069aecbbf08","Type":"ContainerDied","Data":"c10251f0037fba52b330d381e15b5e60135283de86454b72ca8ee7275f326935"} Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.637902 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10251f0037fba52b330d381e15b5e60135283de86454b72ca8ee7275f326935" Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.637982 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-dkm2j" Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.719256 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-scvlj"] Mar 17 02:14:07 crc kubenswrapper[4735]: I0317 02:14:07.726562 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-scvlj"] Mar 17 02:14:09 crc kubenswrapper[4735]: I0317 02:14:09.084949 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311a35f1-29f3-4bf4-a39a-4f642c284101" path="/var/lib/kubelet/pods/311a35f1-29f3-4bf4-a39a-4f642c284101/volumes" Mar 17 02:14:17 crc kubenswrapper[4735]: I0317 02:14:17.072850 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:14:17 crc kubenswrapper[4735]: E0317 02:14:17.073622 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:14:30 crc kubenswrapper[4735]: I0317 02:14:30.073450 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:14:30 crc kubenswrapper[4735]: E0317 02:14:30.074491 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:14:43 crc kubenswrapper[4735]: I0317 02:14:43.074400 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:14:43 crc kubenswrapper[4735]: E0317 02:14:43.075529 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:14:58 crc kubenswrapper[4735]: I0317 02:14:58.073297 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:14:58 crc kubenswrapper[4735]: E0317 02:14:58.074512 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:14:58 crc kubenswrapper[4735]: I0317 02:14:58.932305 4735 scope.go:117] "RemoveContainer" containerID="d359be0e1c53ed68c07664b05b23c3a184d4b1ee71f64868c751311ab2845205" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.513335 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd"] Mar 17 02:15:00 crc kubenswrapper[4735]: E0317 02:15:00.520348 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ff207e-f79e-4d00-8d90-a069aecbbf08" containerName="oc" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 
02:15:00.520537 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ff207e-f79e-4d00-8d90-a069aecbbf08" containerName="oc" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.524969 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ff207e-f79e-4d00-8d90-a069aecbbf08" containerName="oc" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.538202 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.553774 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.554139 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.676670 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd"] Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.700700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-secret-volume\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.700812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-config-volume\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.700930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7w92\" (UniqueName: \"kubernetes.io/projected/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-kube-api-access-j7w92\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.803156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-config-volume\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.803277 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7w92\" (UniqueName: \"kubernetes.io/projected/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-kube-api-access-j7w92\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.803349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-secret-volume\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.804523 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-config-volume\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.829679 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7w92\" (UniqueName: \"kubernetes.io/projected/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-kube-api-access-j7w92\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.832375 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-secret-volume\") pod \"collect-profiles-29561895-mdjwd\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:00 crc kubenswrapper[4735]: I0317 02:15:00.871298 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:02 crc kubenswrapper[4735]: I0317 02:15:02.331696 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd"] Mar 17 02:15:02 crc kubenswrapper[4735]: I0317 02:15:02.691306 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" event={"ID":"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261","Type":"ContainerStarted","Data":"f34d6f2c3caec0f505484a3deb500b2505d65a334c7f4d594c950618ec32c808"} Mar 17 02:15:02 crc kubenswrapper[4735]: I0317 02:15:02.692000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" event={"ID":"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261","Type":"ContainerStarted","Data":"b9306c0666c2475f3186e7a02fa18deaf1d2feee31143ba8e285e10ab89bbe67"} Mar 17 02:15:02 crc kubenswrapper[4735]: I0317 02:15:02.714810 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" podStartSLOduration=2.71364228 podStartE2EDuration="2.71364228s" podCreationTimestamp="2026-03-17 02:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:15:02.706618673 +0000 UTC m=+3928.338851651" watchObservedRunningTime="2026-03-17 02:15:02.71364228 +0000 UTC m=+3928.345875258" Mar 17 02:15:03 crc kubenswrapper[4735]: I0317 02:15:03.700835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" event={"ID":"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261","Type":"ContainerDied","Data":"f34d6f2c3caec0f505484a3deb500b2505d65a334c7f4d594c950618ec32c808"} Mar 17 02:15:03 crc kubenswrapper[4735]: I0317 02:15:03.702440 4735 
generic.go:334] "Generic (PLEG): container finished" podID="3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" containerID="f34d6f2c3caec0f505484a3deb500b2505d65a334c7f4d594c950618ec32c808" exitCode=0 Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.268386 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.387961 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-secret-volume\") pod \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.388069 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-config-volume\") pod \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.388117 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7w92\" (UniqueName: \"kubernetes.io/projected/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-kube-api-access-j7w92\") pod \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\" (UID: \"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261\") " Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.394619 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-config-volume" (OuterVolumeSpecName: "config-volume") pod "3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" (UID: "3be5fc0c-e7b0-45cd-9fce-ed5dde74a261"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.420893 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-kube-api-access-j7w92" (OuterVolumeSpecName: "kube-api-access-j7w92") pod "3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" (UID: "3be5fc0c-e7b0-45cd-9fce-ed5dde74a261"). InnerVolumeSpecName "kube-api-access-j7w92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.421259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" (UID: "3be5fc0c-e7b0-45cd-9fce-ed5dde74a261"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.431592 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf"] Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.475475 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-vlxtf"] Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.491547 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.491587 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7w92\" (UniqueName: \"kubernetes.io/projected/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-kube-api-access-j7w92\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.491600 4735 reconciler_common.go:293] "Volume detached 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.723260 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" event={"ID":"3be5fc0c-e7b0-45cd-9fce-ed5dde74a261","Type":"ContainerDied","Data":"b9306c0666c2475f3186e7a02fa18deaf1d2feee31143ba8e285e10ab89bbe67"} Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.723345 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd" Mar 17 02:15:05 crc kubenswrapper[4735]: I0317 02:15:05.723978 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9306c0666c2475f3186e7a02fa18deaf1d2feee31143ba8e285e10ab89bbe67" Mar 17 02:15:07 crc kubenswrapper[4735]: I0317 02:15:07.085744 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f202568-8dcd-4ba1-8b79-a80560cfcd1a" path="/var/lib/kubelet/pods/6f202568-8dcd-4ba1-8b79-a80560cfcd1a/volumes" Mar 17 02:15:12 crc kubenswrapper[4735]: I0317 02:15:12.076893 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:15:12 crc kubenswrapper[4735]: E0317 02:15:12.081290 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:15:27 crc kubenswrapper[4735]: I0317 02:15:27.077147 4735 scope.go:117] "RemoveContainer" 
containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:15:27 crc kubenswrapper[4735]: E0317 02:15:27.081347 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:15:41 crc kubenswrapper[4735]: I0317 02:15:41.074205 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:15:41 crc kubenswrapper[4735]: E0317 02:15:41.074969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:15:56 crc kubenswrapper[4735]: I0317 02:15:56.073348 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:15:56 crc kubenswrapper[4735]: E0317 02:15:56.074250 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:15:59 crc kubenswrapper[4735]: I0317 02:15:59.222559 4735 scope.go:117] 
"RemoveContainer" containerID="df368615a720d15b95af785eff55f70fe59fc4a47f7f825aefce5b12f370944b" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.580161 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561896-wzbkz"] Mar 17 02:16:00 crc kubenswrapper[4735]: E0317 02:16:00.586955 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" containerName="collect-profiles" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.586983 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" containerName="collect-profiles" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.588835 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" containerName="collect-profiles" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.596156 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.610900 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.611687 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.641408 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.699992 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-wzbkz"] Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.717061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nckz\" (UniqueName: 
\"kubernetes.io/projected/c47b06d0-0fa2-4911-947f-68fb142f0aba-kube-api-access-9nckz\") pod \"auto-csr-approver-29561896-wzbkz\" (UID: \"c47b06d0-0fa2-4911-947f-68fb142f0aba\") " pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.819110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nckz\" (UniqueName: \"kubernetes.io/projected/c47b06d0-0fa2-4911-947f-68fb142f0aba-kube-api-access-9nckz\") pod \"auto-csr-approver-29561896-wzbkz\" (UID: \"c47b06d0-0fa2-4911-947f-68fb142f0aba\") " pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.867140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nckz\" (UniqueName: \"kubernetes.io/projected/c47b06d0-0fa2-4911-947f-68fb142f0aba-kube-api-access-9nckz\") pod \"auto-csr-approver-29561896-wzbkz\" (UID: \"c47b06d0-0fa2-4911-947f-68fb142f0aba\") " pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:00 crc kubenswrapper[4735]: I0317 02:16:00.938838 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:02 crc kubenswrapper[4735]: I0317 02:16:02.698337 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-wzbkz"] Mar 17 02:16:03 crc kubenswrapper[4735]: I0317 02:16:03.627606 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" event={"ID":"c47b06d0-0fa2-4911-947f-68fb142f0aba","Type":"ContainerStarted","Data":"4e01bde908c48f45ed1bd3f9d05697513857e3b962aaadbc8a2e57665e758c89"} Mar 17 02:16:05 crc kubenswrapper[4735]: I0317 02:16:05.647302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" event={"ID":"c47b06d0-0fa2-4911-947f-68fb142f0aba","Type":"ContainerStarted","Data":"c6f761a685ff02965f06a78e8297ab783b2098f974f0618d857cee9c37e4f6c4"} Mar 17 02:16:05 crc kubenswrapper[4735]: I0317 02:16:05.667549 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" podStartSLOduration=4.630854705 podStartE2EDuration="5.666199129s" podCreationTimestamp="2026-03-17 02:16:00 +0000 UTC" firstStartedPulling="2026-03-17 02:16:02.745213209 +0000 UTC m=+3988.377446177" lastFinishedPulling="2026-03-17 02:16:03.780557623 +0000 UTC m=+3989.412790601" observedRunningTime="2026-03-17 02:16:05.664082079 +0000 UTC m=+3991.296315077" watchObservedRunningTime="2026-03-17 02:16:05.666199129 +0000 UTC m=+3991.298432107" Mar 17 02:16:07 crc kubenswrapper[4735]: I0317 02:16:07.075400 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:16:07 crc kubenswrapper[4735]: E0317 02:16:07.085596 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:16:07 crc kubenswrapper[4735]: I0317 02:16:07.662809 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" event={"ID":"c47b06d0-0fa2-4911-947f-68fb142f0aba","Type":"ContainerDied","Data":"c6f761a685ff02965f06a78e8297ab783b2098f974f0618d857cee9c37e4f6c4"} Mar 17 02:16:07 crc kubenswrapper[4735]: I0317 02:16:07.662719 4735 generic.go:334] "Generic (PLEG): container finished" podID="c47b06d0-0fa2-4911-947f-68fb142f0aba" containerID="c6f761a685ff02965f06a78e8297ab783b2098f974f0618d857cee9c37e4f6c4" exitCode=0 Mar 17 02:16:09 crc kubenswrapper[4735]: I0317 02:16:09.904121 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:10 crc kubenswrapper[4735]: I0317 02:16:10.015309 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nckz\" (UniqueName: \"kubernetes.io/projected/c47b06d0-0fa2-4911-947f-68fb142f0aba-kube-api-access-9nckz\") pod \"c47b06d0-0fa2-4911-947f-68fb142f0aba\" (UID: \"c47b06d0-0fa2-4911-947f-68fb142f0aba\") " Mar 17 02:16:10 crc kubenswrapper[4735]: I0317 02:16:10.052139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47b06d0-0fa2-4911-947f-68fb142f0aba-kube-api-access-9nckz" (OuterVolumeSpecName: "kube-api-access-9nckz") pod "c47b06d0-0fa2-4911-947f-68fb142f0aba" (UID: "c47b06d0-0fa2-4911-947f-68fb142f0aba"). InnerVolumeSpecName "kube-api-access-9nckz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:16:10 crc kubenswrapper[4735]: I0317 02:16:10.118315 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nckz\" (UniqueName: \"kubernetes.io/projected/c47b06d0-0fa2-4911-947f-68fb142f0aba-kube-api-access-9nckz\") on node \"crc\" DevicePath \"\"" Mar 17 02:16:10 crc kubenswrapper[4735]: I0317 02:16:10.698956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" event={"ID":"c47b06d0-0fa2-4911-947f-68fb142f0aba","Type":"ContainerDied","Data":"4e01bde908c48f45ed1bd3f9d05697513857e3b962aaadbc8a2e57665e758c89"} Mar 17 02:16:10 crc kubenswrapper[4735]: I0317 02:16:10.699048 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-wzbkz" Mar 17 02:16:10 crc kubenswrapper[4735]: I0317 02:16:10.700381 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e01bde908c48f45ed1bd3f9d05697513857e3b962aaadbc8a2e57665e758c89" Mar 17 02:16:11 crc kubenswrapper[4735]: I0317 02:16:11.042731 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-vt7ht"] Mar 17 02:16:11 crc kubenswrapper[4735]: I0317 02:16:11.050934 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-vt7ht"] Mar 17 02:16:11 crc kubenswrapper[4735]: I0317 02:16:11.087155 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae" path="/var/lib/kubelet/pods/6b7ccf69-72c6-44c7-bbeb-7d570e6ba7ae/volumes" Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.490900 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.491251 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.490433 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rxdbc container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.491347 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" podUID="4299c484-7f23-433c-a4b9-d3daf740fa7c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.524020 4735 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.524225 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get 
\"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.524302 4735 patch_prober.go:28] interesting pod/router-default-5444994796-s2dvf container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.524320 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-s2dvf" podUID="87ef90f4-2d13-4f4c-9bbc-0b438b75a901" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.524362 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rxdbc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.524377 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rxdbc" podUID="4299c484-7f23-433c-a4b9-d3daf740fa7c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 02:16:14 crc kubenswrapper[4735]: I0317 02:16:14.538698 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-g9frk" podUID="2c5a87ba-17e0-4807-980f-f42af0ffb51a" 
containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.82:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 02:16:22 crc kubenswrapper[4735]: I0317 02:16:22.076407 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:16:22 crc kubenswrapper[4735]: I0317 02:16:22.822145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"49aff5992111da97ee2611eb416d94c771a6d6d555e803a414f033e833fe49d5"} Mar 17 02:16:59 crc kubenswrapper[4735]: I0317 02:16:59.457700 4735 scope.go:117] "RemoveContainer" containerID="60c86c63801bfcf1015535e6240d1df65f2ce60bc2944990bc50b297d878ce74" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.657074 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561898-p9d4m"] Mar 17 02:18:00 crc kubenswrapper[4735]: E0317 02:18:00.661994 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47b06d0-0fa2-4911-947f-68fb142f0aba" containerName="oc" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.662035 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47b06d0-0fa2-4911-947f-68fb142f0aba" containerName="oc" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.663667 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c47b06d0-0fa2-4911-947f-68fb142f0aba" containerName="oc" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.674405 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.729653 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.729670 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.729694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.826347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wcc\" (UniqueName: \"kubernetes.io/projected/c0e623a3-6185-4391-a462-a08bebec09a3-kube-api-access-t2wcc\") pod \"auto-csr-approver-29561898-p9d4m\" (UID: \"c0e623a3-6185-4391-a462-a08bebec09a3\") " pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.840396 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-p9d4m"] Mar 17 02:18:00 crc kubenswrapper[4735]: I0317 02:18:00.928461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wcc\" (UniqueName: \"kubernetes.io/projected/c0e623a3-6185-4391-a462-a08bebec09a3-kube-api-access-t2wcc\") pod \"auto-csr-approver-29561898-p9d4m\" (UID: \"c0e623a3-6185-4391-a462-a08bebec09a3\") " pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:01 crc kubenswrapper[4735]: I0317 02:18:01.148625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wcc\" (UniqueName: \"kubernetes.io/projected/c0e623a3-6185-4391-a462-a08bebec09a3-kube-api-access-t2wcc\") pod \"auto-csr-approver-29561898-p9d4m\" (UID: \"c0e623a3-6185-4391-a462-a08bebec09a3\") " 
pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:01 crc kubenswrapper[4735]: I0317 02:18:01.335371 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:02 crc kubenswrapper[4735]: I0317 02:18:02.857051 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-p9d4m"] Mar 17 02:18:02 crc kubenswrapper[4735]: I0317 02:18:02.936276 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:18:03 crc kubenswrapper[4735]: I0317 02:18:03.743502 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" event={"ID":"c0e623a3-6185-4391-a462-a08bebec09a3","Type":"ContainerStarted","Data":"14c0f5f32b634b62d80c6a84fd4cba7a0b6c7a3a1e5ba7cff540ed88ea4be2ff"} Mar 17 02:18:05 crc kubenswrapper[4735]: I0317 02:18:05.761190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" event={"ID":"c0e623a3-6185-4391-a462-a08bebec09a3","Type":"ContainerStarted","Data":"fcac3afcc18af5afb35add028c77e4e7088d310edd7c000a8a1120cb6471c56a"} Mar 17 02:18:05 crc kubenswrapper[4735]: I0317 02:18:05.788334 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" podStartSLOduration=4.8706543799999995 podStartE2EDuration="5.785718234s" podCreationTimestamp="2026-03-17 02:18:00 +0000 UTC" firstStartedPulling="2026-03-17 02:18:02.928445705 +0000 UTC m=+4108.560678683" lastFinishedPulling="2026-03-17 02:18:03.843509559 +0000 UTC m=+4109.475742537" observedRunningTime="2026-03-17 02:18:05.778136073 +0000 UTC m=+4111.410369051" watchObservedRunningTime="2026-03-17 02:18:05.785718234 +0000 UTC m=+4111.417951212" Mar 17 02:18:06 crc kubenswrapper[4735]: I0317 02:18:06.771815 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="c0e623a3-6185-4391-a462-a08bebec09a3" containerID="fcac3afcc18af5afb35add028c77e4e7088d310edd7c000a8a1120cb6471c56a" exitCode=0 Mar 17 02:18:06 crc kubenswrapper[4735]: I0317 02:18:06.771874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" event={"ID":"c0e623a3-6185-4391-a462-a08bebec09a3","Type":"ContainerDied","Data":"fcac3afcc18af5afb35add028c77e4e7088d310edd7c000a8a1120cb6471c56a"} Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.428679 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.508533 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wcc\" (UniqueName: \"kubernetes.io/projected/c0e623a3-6185-4391-a462-a08bebec09a3-kube-api-access-t2wcc\") pod \"c0e623a3-6185-4391-a462-a08bebec09a3\" (UID: \"c0e623a3-6185-4391-a462-a08bebec09a3\") " Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.530607 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e623a3-6185-4391-a462-a08bebec09a3-kube-api-access-t2wcc" (OuterVolumeSpecName: "kube-api-access-t2wcc") pod "c0e623a3-6185-4391-a462-a08bebec09a3" (UID: "c0e623a3-6185-4391-a462-a08bebec09a3"). InnerVolumeSpecName "kube-api-access-t2wcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.610795 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wcc\" (UniqueName: \"kubernetes.io/projected/c0e623a3-6185-4391-a462-a08bebec09a3-kube-api-access-t2wcc\") on node \"crc\" DevicePath \"\"" Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.789463 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" event={"ID":"c0e623a3-6185-4391-a462-a08bebec09a3","Type":"ContainerDied","Data":"14c0f5f32b634b62d80c6a84fd4cba7a0b6c7a3a1e5ba7cff540ed88ea4be2ff"} Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.789710 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14c0f5f32b634b62d80c6a84fd4cba7a0b6c7a3a1e5ba7cff540ed88ea4be2ff" Mar 17 02:18:08 crc kubenswrapper[4735]: I0317 02:18:08.789531 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-p9d4m" Mar 17 02:18:09 crc kubenswrapper[4735]: I0317 02:18:09.544332 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vsrk9"] Mar 17 02:18:09 crc kubenswrapper[4735]: I0317 02:18:09.555811 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vsrk9"] Mar 17 02:18:11 crc kubenswrapper[4735]: I0317 02:18:11.089330 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ef27bd-1f2f-4cee-a78a-ef79b83321de" path="/var/lib/kubelet/pods/f2ef27bd-1f2f-4cee-a78a-ef79b83321de/volumes" Mar 17 02:18:42 crc kubenswrapper[4735]: I0317 02:18:42.608804 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 02:18:42 crc kubenswrapper[4735]: I0317 02:18:42.612717 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:18:59 crc kubenswrapper[4735]: I0317 02:18:59.914018 4735 scope.go:117] "RemoveContainer" containerID="f49c69688c59f7491698bdbc129fdd8680f7807a97fa2c94425dbdd0eeada84b" Mar 17 02:19:12 crc kubenswrapper[4735]: I0317 02:19:12.606183 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:19:12 crc kubenswrapper[4735]: I0317 02:19:12.606716 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:19:27 crc kubenswrapper[4735]: I0317 02:19:27.894279 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mn589"] Mar 17 02:19:27 crc kubenswrapper[4735]: E0317 02:19:27.898656 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e623a3-6185-4391-a462-a08bebec09a3" containerName="oc" Mar 17 02:19:27 crc kubenswrapper[4735]: I0317 02:19:27.898691 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e623a3-6185-4391-a462-a08bebec09a3" containerName="oc" Mar 17 02:19:27 crc kubenswrapper[4735]: I0317 02:19:27.900702 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c0e623a3-6185-4391-a462-a08bebec09a3" containerName="oc" Mar 17 02:19:27 crc kubenswrapper[4735]: I0317 02:19:27.910518 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.024900 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-catalog-content\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.025025 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vks87\" (UniqueName: \"kubernetes.io/projected/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-kube-api-access-vks87\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.025072 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-utilities\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.075734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn589"] Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.127207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vks87\" (UniqueName: \"kubernetes.io/projected/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-kube-api-access-vks87\") pod 
\"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.127276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-utilities\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.127445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-catalog-content\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.130001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-utilities\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.130477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-catalog-content\") pod \"redhat-marketplace-mn589\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.174793 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vks87\" (UniqueName: \"kubernetes.io/projected/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-kube-api-access-vks87\") pod \"redhat-marketplace-mn589\" (UID: 
\"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:28 crc kubenswrapper[4735]: I0317 02:19:28.249608 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:29 crc kubenswrapper[4735]: I0317 02:19:29.682044 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn589"] Mar 17 02:19:30 crc kubenswrapper[4735]: I0317 02:19:30.641791 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerDied","Data":"eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa"} Mar 17 02:19:30 crc kubenswrapper[4735]: I0317 02:19:30.642291 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerID="eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa" exitCode=0 Mar 17 02:19:30 crc kubenswrapper[4735]: I0317 02:19:30.643239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerStarted","Data":"9015939f452040e63b527a61d907f5e5b76e4d96f8c1eb0c54e9548f6b38cbb8"} Mar 17 02:19:31 crc kubenswrapper[4735]: I0317 02:19:31.653985 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerStarted","Data":"aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85"} Mar 17 02:19:33 crc kubenswrapper[4735]: I0317 02:19:33.673497 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerID="aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85" exitCode=0 Mar 17 02:19:33 crc kubenswrapper[4735]: I0317 
02:19:33.673622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerDied","Data":"aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85"} Mar 17 02:19:35 crc kubenswrapper[4735]: I0317 02:19:35.693891 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerStarted","Data":"5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9"} Mar 17 02:19:35 crc kubenswrapper[4735]: I0317 02:19:35.725901 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mn589" podStartSLOduration=5.17219794 podStartE2EDuration="8.722594619s" podCreationTimestamp="2026-03-17 02:19:27 +0000 UTC" firstStartedPulling="2026-03-17 02:19:30.644732632 +0000 UTC m=+4196.276965620" lastFinishedPulling="2026-03-17 02:19:34.195129321 +0000 UTC m=+4199.827362299" observedRunningTime="2026-03-17 02:19:35.717219901 +0000 UTC m=+4201.349452879" watchObservedRunningTime="2026-03-17 02:19:35.722594619 +0000 UTC m=+4201.354827597" Mar 17 02:19:38 crc kubenswrapper[4735]: I0317 02:19:38.249938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:38 crc kubenswrapper[4735]: I0317 02:19:38.251239 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:39 crc kubenswrapper[4735]: I0317 02:19:39.294546 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mn589" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="registry-server" probeResult="failure" output=< Mar 17 02:19:39 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:19:39 
crc kubenswrapper[4735]: > Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.606768 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.608794 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.608949 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.611388 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49aff5992111da97ee2611eb416d94c771a6d6d555e803a414f033e833fe49d5"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.611976 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://49aff5992111da97ee2611eb416d94c771a6d6d555e803a414f033e833fe49d5" gracePeriod=600 Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.759065 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" 
containerID="49aff5992111da97ee2611eb416d94c771a6d6d555e803a414f033e833fe49d5" exitCode=0 Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.759134 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"49aff5992111da97ee2611eb416d94c771a6d6d555e803a414f033e833fe49d5"} Mar 17 02:19:42 crc kubenswrapper[4735]: I0317 02:19:42.759458 4735 scope.go:117] "RemoveContainer" containerID="2ceaf1a630c1bdfce59b3e8d6c5b1a32243e5d552c25af5ca94a3cbe801d0d10" Mar 17 02:19:43 crc kubenswrapper[4735]: I0317 02:19:43.768973 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486"} Mar 17 02:19:48 crc kubenswrapper[4735]: I0317 02:19:48.315926 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:48 crc kubenswrapper[4735]: I0317 02:19:48.380116 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:48 crc kubenswrapper[4735]: I0317 02:19:48.572093 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn589"] Mar 17 02:19:49 crc kubenswrapper[4735]: I0317 02:19:49.832383 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mn589" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="registry-server" containerID="cri-o://5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9" gracePeriod=2 Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.687146 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.842209 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerID="5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9" exitCode=0 Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.842264 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerDied","Data":"5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9"} Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.842384 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn589" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.842543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn589" event={"ID":"f9bcf57c-9a17-4141-8304-0d053f6d3ed6","Type":"ContainerDied","Data":"9015939f452040e63b527a61d907f5e5b76e4d96f8c1eb0c54e9548f6b38cbb8"} Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.842560 4735 scope.go:117] "RemoveContainer" containerID="5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.865409 4735 scope.go:117] "RemoveContainer" containerID="aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.879826 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-utilities\") pod \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.880015 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vks87\" (UniqueName: \"kubernetes.io/projected/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-kube-api-access-vks87\") pod \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.880077 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-catalog-content\") pod \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\" (UID: \"f9bcf57c-9a17-4141-8304-0d053f6d3ed6\") " Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.882354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-utilities" (OuterVolumeSpecName: "utilities") pod "f9bcf57c-9a17-4141-8304-0d053f6d3ed6" (UID: "f9bcf57c-9a17-4141-8304-0d053f6d3ed6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.901417 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-kube-api-access-vks87" (OuterVolumeSpecName: "kube-api-access-vks87") pod "f9bcf57c-9a17-4141-8304-0d053f6d3ed6" (UID: "f9bcf57c-9a17-4141-8304-0d053f6d3ed6"). InnerVolumeSpecName "kube-api-access-vks87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.906194 4735 scope.go:117] "RemoveContainer" containerID="eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.949798 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9bcf57c-9a17-4141-8304-0d053f6d3ed6" (UID: "f9bcf57c-9a17-4141-8304-0d053f6d3ed6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.982809 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vks87\" (UniqueName: \"kubernetes.io/projected/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-kube-api-access-vks87\") on node \"crc\" DevicePath \"\"" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.982838 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.982847 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bcf57c-9a17-4141-8304-0d053f6d3ed6-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.989233 4735 scope.go:117] "RemoveContainer" containerID="5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9" Mar 17 02:19:50 crc kubenswrapper[4735]: E0317 02:19:50.992829 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9\": container with ID starting with 
5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9 not found: ID does not exist" containerID="5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.992938 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9"} err="failed to get container status \"5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9\": rpc error: code = NotFound desc = could not find container \"5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9\": container with ID starting with 5fa9d477f4768e9738fdd0ee8c8ff10a2809476f86147dcf7c3f5413bda8e4a9 not found: ID does not exist" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.992962 4735 scope.go:117] "RemoveContainer" containerID="aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85" Mar 17 02:19:50 crc kubenswrapper[4735]: E0317 02:19:50.993355 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85\": container with ID starting with aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85 not found: ID does not exist" containerID="aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.993370 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85"} err="failed to get container status \"aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85\": rpc error: code = NotFound desc = could not find container \"aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85\": container with ID starting with aff2d92bb548c727527b2cb69cd3b5f4c0afce231b6c47f133495911b40f3b85 not found: ID does not 
exist" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.993383 4735 scope.go:117] "RemoveContainer" containerID="eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa" Mar 17 02:19:50 crc kubenswrapper[4735]: E0317 02:19:50.993773 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa\": container with ID starting with eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa not found: ID does not exist" containerID="eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa" Mar 17 02:19:50 crc kubenswrapper[4735]: I0317 02:19:50.993800 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa"} err="failed to get container status \"eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa\": rpc error: code = NotFound desc = could not find container \"eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa\": container with ID starting with eddc9b34b9e3ccce5d5be684c5df4393a1c4d507da82704d2f35e0aee98fd6aa not found: ID does not exist" Mar 17 02:19:51 crc kubenswrapper[4735]: I0317 02:19:51.171020 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn589"] Mar 17 02:19:51 crc kubenswrapper[4735]: I0317 02:19:51.184298 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn589"] Mar 17 02:19:53 crc kubenswrapper[4735]: I0317 02:19:53.109178 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" path="/var/lib/kubelet/pods/f9bcf57c-9a17-4141-8304-0d053f6d3ed6/volumes" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.364125 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29561900-227pn"] Mar 17 02:20:00 crc kubenswrapper[4735]: E0317 02:20:00.370482 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="extract-utilities" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.370526 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="extract-utilities" Mar 17 02:20:00 crc kubenswrapper[4735]: E0317 02:20:00.370547 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="registry-server" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.370556 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="registry-server" Mar 17 02:20:00 crc kubenswrapper[4735]: E0317 02:20:00.370599 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="extract-content" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.370608 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="extract-content" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.371041 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bcf57c-9a17-4141-8304-0d053f6d3ed6" containerName="registry-server" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.374983 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.429541 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.429546 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.429821 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.449071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-227pn"] Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.476529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnmk4\" (UniqueName: \"kubernetes.io/projected/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf-kube-api-access-pnmk4\") pod \"auto-csr-approver-29561900-227pn\" (UID: \"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf\") " pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.579432 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnmk4\" (UniqueName: \"kubernetes.io/projected/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf-kube-api-access-pnmk4\") pod \"auto-csr-approver-29561900-227pn\" (UID: \"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf\") " pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.603520 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnmk4\" (UniqueName: \"kubernetes.io/projected/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf-kube-api-access-pnmk4\") pod \"auto-csr-approver-29561900-227pn\" (UID: \"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf\") " 
pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:00 crc kubenswrapper[4735]: I0317 02:20:00.694707 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:01 crc kubenswrapper[4735]: I0317 02:20:01.439254 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-227pn"] Mar 17 02:20:01 crc kubenswrapper[4735]: W0317 02:20:01.451213 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf60b4b_f5d3_48f9_adb1_c342dbd3cbaf.slice/crio-31ee8e54e87b2d946a0610bcd2a77f96da071f2f0207e12f441897a1c115955c WatchSource:0}: Error finding container 31ee8e54e87b2d946a0610bcd2a77f96da071f2f0207e12f441897a1c115955c: Status 404 returned error can't find the container with id 31ee8e54e87b2d946a0610bcd2a77f96da071f2f0207e12f441897a1c115955c Mar 17 02:20:01 crc kubenswrapper[4735]: I0317 02:20:01.939217 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-227pn" event={"ID":"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf","Type":"ContainerStarted","Data":"31ee8e54e87b2d946a0610bcd2a77f96da071f2f0207e12f441897a1c115955c"} Mar 17 02:20:03 crc kubenswrapper[4735]: I0317 02:20:03.956324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-227pn" event={"ID":"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf","Type":"ContainerStarted","Data":"dfe9fa387a05771ff172db8b9cf2660e45664f0df8c4aa2cc607d5c0c12b009c"} Mar 17 02:20:03 crc kubenswrapper[4735]: I0317 02:20:03.980984 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561900-227pn" podStartSLOduration=2.609761218 podStartE2EDuration="3.980158159s" podCreationTimestamp="2026-03-17 02:20:00 +0000 UTC" firstStartedPulling="2026-03-17 02:20:01.453914338 +0000 UTC 
m=+4227.086147356" lastFinishedPulling="2026-03-17 02:20:02.824311319 +0000 UTC m=+4228.456544297" observedRunningTime="2026-03-17 02:20:03.972500777 +0000 UTC m=+4229.604733755" watchObservedRunningTime="2026-03-17 02:20:03.980158159 +0000 UTC m=+4229.612391147" Mar 17 02:20:05 crc kubenswrapper[4735]: I0317 02:20:05.973625 4735 generic.go:334] "Generic (PLEG): container finished" podID="4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf" containerID="dfe9fa387a05771ff172db8b9cf2660e45664f0df8c4aa2cc607d5c0c12b009c" exitCode=0 Mar 17 02:20:05 crc kubenswrapper[4735]: I0317 02:20:05.973743 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-227pn" event={"ID":"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf","Type":"ContainerDied","Data":"dfe9fa387a05771ff172db8b9cf2660e45664f0df8c4aa2cc607d5c0c12b009c"} Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.538183 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.714920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnmk4\" (UniqueName: \"kubernetes.io/projected/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf-kube-api-access-pnmk4\") pod \"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf\" (UID: \"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf\") " Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.724110 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf-kube-api-access-pnmk4" (OuterVolumeSpecName: "kube-api-access-pnmk4") pod "4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf" (UID: "4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf"). InnerVolumeSpecName "kube-api-access-pnmk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.817583 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnmk4\" (UniqueName: \"kubernetes.io/projected/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf-kube-api-access-pnmk4\") on node \"crc\" DevicePath \"\"" Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.992362 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-227pn" event={"ID":"4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf","Type":"ContainerDied","Data":"31ee8e54e87b2d946a0610bcd2a77f96da071f2f0207e12f441897a1c115955c"} Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.992405 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ee8e54e87b2d946a0610bcd2a77f96da071f2f0207e12f441897a1c115955c" Mar 17 02:20:07 crc kubenswrapper[4735]: I0317 02:20:07.992449 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-227pn" Mar 17 02:20:08 crc kubenswrapper[4735]: I0317 02:20:08.092158 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-dkm2j"] Mar 17 02:20:08 crc kubenswrapper[4735]: I0317 02:20:08.098719 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-dkm2j"] Mar 17 02:20:09 crc kubenswrapper[4735]: I0317 02:20:09.084637 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ff207e-f79e-4d00-8d90-a069aecbbf08" path="/var/lib/kubelet/pods/57ff207e-f79e-4d00-8d90-a069aecbbf08/volumes" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.367997 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k95fm"] Mar 17 02:20:57 crc kubenswrapper[4735]: E0317 02:20:57.371752 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf" containerName="oc" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.371771 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf" containerName="oc" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.373568 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf" containerName="oc" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.379839 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.449711 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-utilities\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.450059 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkz2\" (UniqueName: \"kubernetes.io/projected/d644fe97-b2a1-47df-bb25-7b3356dfed6e-kube-api-access-mtkz2\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.450149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-catalog-content\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.550142 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-k95fm"] Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.551010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-utilities\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.551066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkz2\" (UniqueName: \"kubernetes.io/projected/d644fe97-b2a1-47df-bb25-7b3356dfed6e-kube-api-access-mtkz2\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.551116 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-catalog-content\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.557370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-catalog-content\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.559756 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-utilities\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " 
pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.597052 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkz2\" (UniqueName: \"kubernetes.io/projected/d644fe97-b2a1-47df-bb25-7b3356dfed6e-kube-api-access-mtkz2\") pod \"certified-operators-k95fm\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:57 crc kubenswrapper[4735]: I0317 02:20:57.722009 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:20:59 crc kubenswrapper[4735]: I0317 02:20:59.323310 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k95fm"] Mar 17 02:21:00 crc kubenswrapper[4735]: I0317 02:21:00.469600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerDied","Data":"7272dde0e32e01d8b97e446ae19124d673ad62e39ce7a3452365849873bfcd2f"} Mar 17 02:21:00 crc kubenswrapper[4735]: I0317 02:21:00.471348 4735 generic.go:334] "Generic (PLEG): container finished" podID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerID="7272dde0e32e01d8b97e446ae19124d673ad62e39ce7a3452365849873bfcd2f" exitCode=0 Mar 17 02:21:00 crc kubenswrapper[4735]: I0317 02:21:00.472067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerStarted","Data":"872b5f0d59218155d5ebe505e440e37bdf583d84022916d8d79ab40707ad85a2"} Mar 17 02:21:00 crc kubenswrapper[4735]: I0317 02:21:00.484310 4735 scope.go:117] "RemoveContainer" containerID="ee388b79d64271f7deeb9aae628d19279eb90bfae6f0a952e27471fe6efd8865" Mar 17 02:21:02 crc kubenswrapper[4735]: I0317 02:21:02.499452 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerStarted","Data":"08aa070283ccced6f2869bd83c92a5980e75b5557d9cc47a9d2268ec0017af36"} Mar 17 02:21:04 crc kubenswrapper[4735]: I0317 02:21:04.520524 4735 generic.go:334] "Generic (PLEG): container finished" podID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerID="08aa070283ccced6f2869bd83c92a5980e75b5557d9cc47a9d2268ec0017af36" exitCode=0 Mar 17 02:21:04 crc kubenswrapper[4735]: I0317 02:21:04.520595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerDied","Data":"08aa070283ccced6f2869bd83c92a5980e75b5557d9cc47a9d2268ec0017af36"} Mar 17 02:21:05 crc kubenswrapper[4735]: I0317 02:21:05.533814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerStarted","Data":"49154f558e24bb167346005752a1734b3548e874cf9faac1d372dd31f035ac40"} Mar 17 02:21:05 crc kubenswrapper[4735]: I0317 02:21:05.570773 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k95fm" podStartSLOduration=3.988283783 podStartE2EDuration="8.568782813s" podCreationTimestamp="2026-03-17 02:20:57 +0000 UTC" firstStartedPulling="2026-03-17 02:21:00.471071471 +0000 UTC m=+4286.103304449" lastFinishedPulling="2026-03-17 02:21:05.051570491 +0000 UTC m=+4290.683803479" observedRunningTime="2026-03-17 02:21:05.56240161 +0000 UTC m=+4291.194634628" watchObservedRunningTime="2026-03-17 02:21:05.568782813 +0000 UTC m=+4291.201015791" Mar 17 02:21:07 crc kubenswrapper[4735]: I0317 02:21:07.723007 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:21:07 crc kubenswrapper[4735]: 
I0317 02:21:07.725014 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:21:08 crc kubenswrapper[4735]: I0317 02:21:08.787307 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k95fm" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="registry-server" probeResult="failure" output=< Mar 17 02:21:08 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:21:08 crc kubenswrapper[4735]: > Mar 17 02:21:18 crc kubenswrapper[4735]: I0317 02:21:18.783323 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k95fm" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="registry-server" probeResult="failure" output=< Mar 17 02:21:18 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:21:18 crc kubenswrapper[4735]: > Mar 17 02:21:27 crc kubenswrapper[4735]: I0317 02:21:27.819324 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:21:27 crc kubenswrapper[4735]: I0317 02:21:27.896029 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:21:28 crc kubenswrapper[4735]: I0317 02:21:28.566590 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k95fm"] Mar 17 02:21:29 crc kubenswrapper[4735]: I0317 02:21:29.828141 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k95fm" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="registry-server" containerID="cri-o://49154f558e24bb167346005752a1734b3548e874cf9faac1d372dd31f035ac40" gracePeriod=2 Mar 17 02:21:30 crc kubenswrapper[4735]: I0317 02:21:30.839809 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerID="49154f558e24bb167346005752a1734b3548e874cf9faac1d372dd31f035ac40" exitCode=0 Mar 17 02:21:30 crc kubenswrapper[4735]: I0317 02:21:30.840165 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerDied","Data":"49154f558e24bb167346005752a1734b3548e874cf9faac1d372dd31f035ac40"} Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.324693 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.506580 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-utilities\") pod \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.506822 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtkz2\" (UniqueName: \"kubernetes.io/projected/d644fe97-b2a1-47df-bb25-7b3356dfed6e-kube-api-access-mtkz2\") pod \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.506874 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-catalog-content\") pod \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\" (UID: \"d644fe97-b2a1-47df-bb25-7b3356dfed6e\") " Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.509327 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-utilities" (OuterVolumeSpecName: 
"utilities") pod "d644fe97-b2a1-47df-bb25-7b3356dfed6e" (UID: "d644fe97-b2a1-47df-bb25-7b3356dfed6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.544342 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d644fe97-b2a1-47df-bb25-7b3356dfed6e-kube-api-access-mtkz2" (OuterVolumeSpecName: "kube-api-access-mtkz2") pod "d644fe97-b2a1-47df-bb25-7b3356dfed6e" (UID: "d644fe97-b2a1-47df-bb25-7b3356dfed6e"). InnerVolumeSpecName "kube-api-access-mtkz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.609687 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtkz2\" (UniqueName: \"kubernetes.io/projected/d644fe97-b2a1-47df-bb25-7b3356dfed6e-kube-api-access-mtkz2\") on node \"crc\" DevicePath \"\"" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.609716 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.663772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d644fe97-b2a1-47df-bb25-7b3356dfed6e" (UID: "d644fe97-b2a1-47df-bb25-7b3356dfed6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.711372 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d644fe97-b2a1-47df-bb25-7b3356dfed6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.855559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k95fm" event={"ID":"d644fe97-b2a1-47df-bb25-7b3356dfed6e","Type":"ContainerDied","Data":"872b5f0d59218155d5ebe505e440e37bdf583d84022916d8d79ab40707ad85a2"} Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.855611 4735 scope.go:117] "RemoveContainer" containerID="49154f558e24bb167346005752a1734b3548e874cf9faac1d372dd31f035ac40" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.855615 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k95fm" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.903279 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k95fm"] Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.914381 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k95fm"] Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.927201 4735 scope.go:117] "RemoveContainer" containerID="08aa070283ccced6f2869bd83c92a5980e75b5557d9cc47a9d2268ec0017af36" Mar 17 02:21:31 crc kubenswrapper[4735]: I0317 02:21:31.960392 4735 scope.go:117] "RemoveContainer" containerID="7272dde0e32e01d8b97e446ae19124d673ad62e39ce7a3452365849873bfcd2f" Mar 17 02:21:33 crc kubenswrapper[4735]: I0317 02:21:33.086002 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" path="/var/lib/kubelet/pods/d644fe97-b2a1-47df-bb25-7b3356dfed6e/volumes" Mar 17 02:21:42 crc 
kubenswrapper[4735]: I0317 02:21:42.606212 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:21:42 crc kubenswrapper[4735]: I0317 02:21:42.607465 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.480930 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561902-4zb2x"] Mar 17 02:22:00 crc kubenswrapper[4735]: E0317 02:22:00.491522 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="extract-utilities" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.491568 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="extract-utilities" Mar 17 02:22:00 crc kubenswrapper[4735]: E0317 02:22:00.491596 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="registry-server" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.491603 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="registry-server" Mar 17 02:22:00 crc kubenswrapper[4735]: E0317 02:22:00.491627 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="extract-content" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.491633 4735 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="extract-content" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.492738 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d644fe97-b2a1-47df-bb25-7b3356dfed6e" containerName="registry-server" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.497791 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.511307 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.511389 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.511495 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.588302 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-4zb2x"] Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.598019 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7kn\" (UniqueName: \"kubernetes.io/projected/dca12daa-c3de-4401-bdfc-32943298fa80-kube-api-access-xn7kn\") pod \"auto-csr-approver-29561902-4zb2x\" (UID: \"dca12daa-c3de-4401-bdfc-32943298fa80\") " pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.699236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7kn\" (UniqueName: \"kubernetes.io/projected/dca12daa-c3de-4401-bdfc-32943298fa80-kube-api-access-xn7kn\") pod \"auto-csr-approver-29561902-4zb2x\" (UID: \"dca12daa-c3de-4401-bdfc-32943298fa80\") " 
pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.725306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7kn\" (UniqueName: \"kubernetes.io/projected/dca12daa-c3de-4401-bdfc-32943298fa80-kube-api-access-xn7kn\") pod \"auto-csr-approver-29561902-4zb2x\" (UID: \"dca12daa-c3de-4401-bdfc-32943298fa80\") " pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:00 crc kubenswrapper[4735]: I0317 02:22:00.823547 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:01 crc kubenswrapper[4735]: I0317 02:22:01.542990 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-4zb2x"] Mar 17 02:22:01 crc kubenswrapper[4735]: W0317 02:22:01.559000 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca12daa_c3de_4401_bdfc_32943298fa80.slice/crio-c633cfb9f7b5967bc968d4ea75ffeb197609f718f70faf39c66f2979c4b58339 WatchSource:0}: Error finding container c633cfb9f7b5967bc968d4ea75ffeb197609f718f70faf39c66f2979c4b58339: Status 404 returned error can't find the container with id c633cfb9f7b5967bc968d4ea75ffeb197609f718f70faf39c66f2979c4b58339 Mar 17 02:22:02 crc kubenswrapper[4735]: I0317 02:22:02.197811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" event={"ID":"dca12daa-c3de-4401-bdfc-32943298fa80","Type":"ContainerStarted","Data":"c633cfb9f7b5967bc968d4ea75ffeb197609f718f70faf39c66f2979c4b58339"} Mar 17 02:22:04 crc kubenswrapper[4735]: I0317 02:22:04.221967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" 
event={"ID":"dca12daa-c3de-4401-bdfc-32943298fa80","Type":"ContainerStarted","Data":"c74e0d99b51df6be877e708d716d2ea5ccbb48be320b395d742a3b25c356a053"} Mar 17 02:22:04 crc kubenswrapper[4735]: I0317 02:22:04.245925 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" podStartSLOduration=3.358877246 podStartE2EDuration="4.244680663s" podCreationTimestamp="2026-03-17 02:22:00 +0000 UTC" firstStartedPulling="2026-03-17 02:22:01.574738593 +0000 UTC m=+4347.206971571" lastFinishedPulling="2026-03-17 02:22:02.46054201 +0000 UTC m=+4348.092774988" observedRunningTime="2026-03-17 02:22:04.240632487 +0000 UTC m=+4349.872865485" watchObservedRunningTime="2026-03-17 02:22:04.244680663 +0000 UTC m=+4349.876913651" Mar 17 02:22:05 crc kubenswrapper[4735]: I0317 02:22:05.231084 4735 generic.go:334] "Generic (PLEG): container finished" podID="dca12daa-c3de-4401-bdfc-32943298fa80" containerID="c74e0d99b51df6be877e708d716d2ea5ccbb48be320b395d742a3b25c356a053" exitCode=0 Mar 17 02:22:05 crc kubenswrapper[4735]: I0317 02:22:05.231137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" event={"ID":"dca12daa-c3de-4401-bdfc-32943298fa80","Type":"ContainerDied","Data":"c74e0d99b51df6be877e708d716d2ea5ccbb48be320b395d742a3b25c356a053"} Mar 17 02:22:06 crc kubenswrapper[4735]: I0317 02:22:06.731152 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:06 crc kubenswrapper[4735]: I0317 02:22:06.819717 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7kn\" (UniqueName: \"kubernetes.io/projected/dca12daa-c3de-4401-bdfc-32943298fa80-kube-api-access-xn7kn\") pod \"dca12daa-c3de-4401-bdfc-32943298fa80\" (UID: \"dca12daa-c3de-4401-bdfc-32943298fa80\") " Mar 17 02:22:06 crc kubenswrapper[4735]: I0317 02:22:06.833070 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca12daa-c3de-4401-bdfc-32943298fa80-kube-api-access-xn7kn" (OuterVolumeSpecName: "kube-api-access-xn7kn") pod "dca12daa-c3de-4401-bdfc-32943298fa80" (UID: "dca12daa-c3de-4401-bdfc-32943298fa80"). InnerVolumeSpecName "kube-api-access-xn7kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:22:06 crc kubenswrapper[4735]: I0317 02:22:06.921462 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn7kn\" (UniqueName: \"kubernetes.io/projected/dca12daa-c3de-4401-bdfc-32943298fa80-kube-api-access-xn7kn\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:07 crc kubenswrapper[4735]: I0317 02:22:07.254875 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" event={"ID":"dca12daa-c3de-4401-bdfc-32943298fa80","Type":"ContainerDied","Data":"c633cfb9f7b5967bc968d4ea75ffeb197609f718f70faf39c66f2979c4b58339"} Mar 17 02:22:07 crc kubenswrapper[4735]: I0317 02:22:07.255793 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c633cfb9f7b5967bc968d4ea75ffeb197609f718f70faf39c66f2979c4b58339" Mar 17 02:22:07 crc kubenswrapper[4735]: I0317 02:22:07.255008 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-4zb2x" Mar 17 02:22:07 crc kubenswrapper[4735]: I0317 02:22:07.352056 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-wzbkz"] Mar 17 02:22:07 crc kubenswrapper[4735]: I0317 02:22:07.361413 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-wzbkz"] Mar 17 02:22:09 crc kubenswrapper[4735]: I0317 02:22:09.084645 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47b06d0-0fa2-4911-947f-68fb142f0aba" path="/var/lib/kubelet/pods/c47b06d0-0fa2-4911-947f-68fb142f0aba/volumes" Mar 17 02:22:12 crc kubenswrapper[4735]: I0317 02:22:12.606070 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:22:12 crc kubenswrapper[4735]: I0317 02:22:12.606562 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.393424 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gvj99"] Mar 17 02:22:42 crc kubenswrapper[4735]: E0317 02:22:42.395292 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca12daa-c3de-4401-bdfc-32943298fa80" containerName="oc" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.395377 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca12daa-c3de-4401-bdfc-32943298fa80" containerName="oc" Mar 17 02:22:42 crc 
kubenswrapper[4735]: I0317 02:22:42.395634 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca12daa-c3de-4401-bdfc-32943298fa80" containerName="oc" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.398134 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.411687 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gvj99"] Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.541024 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-catalog-content\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.541208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-utilities\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.541286 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvr75\" (UniqueName: \"kubernetes.io/projected/5330f494-4526-455b-92fb-4a0d690f991a-kube-api-access-vvr75\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.606518 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.606567 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.606603 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.607332 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.607380 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" gracePeriod=600 Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.644621 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-catalog-content\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc 
kubenswrapper[4735]: I0317 02:22:42.645222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-utilities\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.645994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvr75\" (UniqueName: \"kubernetes.io/projected/5330f494-4526-455b-92fb-4a0d690f991a-kube-api-access-vvr75\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.650531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-catalog-content\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.650794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-utilities\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 02:22:42.707758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvr75\" (UniqueName: \"kubernetes.io/projected/5330f494-4526-455b-92fb-4a0d690f991a-kube-api-access-vvr75\") pod \"community-operators-gvj99\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: I0317 
02:22:42.717303 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:42 crc kubenswrapper[4735]: E0317 02:22:42.727710 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.359214 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gvj99"] Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.604822 4735 generic.go:334] "Generic (PLEG): container finished" podID="5330f494-4526-455b-92fb-4a0d690f991a" containerID="8caa40e6b0bfa060f4033be506d7ce688003c6b6c154a49823a282c8a36db5d0" exitCode=0 Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.604879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerDied","Data":"8caa40e6b0bfa060f4033be506d7ce688003c6b6c154a49823a282c8a36db5d0"} Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.605102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerStarted","Data":"d5eb590d9f76efdd8fa596d1d2e7ff6a7ed4348267300e6d6afa0baf0efd91af"} Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.610471 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" exitCode=0 Mar 17 02:22:43 crc 
kubenswrapper[4735]: I0317 02:22:43.610514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486"} Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.611614 4735 scope.go:117] "RemoveContainer" containerID="49aff5992111da97ee2611eb416d94c771a6d6d555e803a414f033e833fe49d5" Mar 17 02:22:43 crc kubenswrapper[4735]: I0317 02:22:43.612294 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:22:43 crc kubenswrapper[4735]: E0317 02:22:43.612602 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:22:45 crc kubenswrapper[4735]: I0317 02:22:45.640142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerStarted","Data":"6b9c47bf5e41a4e0d2d5fff5a8ac74abc758a2e1151e35a0996d1c97725b4f16"} Mar 17 02:22:46 crc kubenswrapper[4735]: I0317 02:22:46.652216 4735 generic.go:334] "Generic (PLEG): container finished" podID="5330f494-4526-455b-92fb-4a0d690f991a" containerID="6b9c47bf5e41a4e0d2d5fff5a8ac74abc758a2e1151e35a0996d1c97725b4f16" exitCode=0 Mar 17 02:22:46 crc kubenswrapper[4735]: I0317 02:22:46.652290 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" 
event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerDied","Data":"6b9c47bf5e41a4e0d2d5fff5a8ac74abc758a2e1151e35a0996d1c97725b4f16"} Mar 17 02:22:47 crc kubenswrapper[4735]: I0317 02:22:47.663963 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerStarted","Data":"48fa87c5881d66e7f0554db740ace678368dc64fb1f0f599b5a6f0a27e7ab2ce"} Mar 17 02:22:47 crc kubenswrapper[4735]: I0317 02:22:47.684133 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gvj99" podStartSLOduration=2.18600464 podStartE2EDuration="5.684116904s" podCreationTimestamp="2026-03-17 02:22:42 +0000 UTC" firstStartedPulling="2026-03-17 02:22:43.607131867 +0000 UTC m=+4389.239364845" lastFinishedPulling="2026-03-17 02:22:47.105244131 +0000 UTC m=+4392.737477109" observedRunningTime="2026-03-17 02:22:47.680928848 +0000 UTC m=+4393.313161826" watchObservedRunningTime="2026-03-17 02:22:47.684116904 +0000 UTC m=+4393.316349882" Mar 17 02:22:52 crc kubenswrapper[4735]: I0317 02:22:52.718258 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:52 crc kubenswrapper[4735]: I0317 02:22:52.719160 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:22:53 crc kubenswrapper[4735]: I0317 02:22:53.781624 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gvj99" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="registry-server" probeResult="failure" output=< Mar 17 02:22:53 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:22:53 crc kubenswrapper[4735]: > Mar 17 02:22:56 crc kubenswrapper[4735]: I0317 02:22:56.074553 4735 scope.go:117] 
"RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:22:56 crc kubenswrapper[4735]: E0317 02:22:56.075162 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:23:00 crc kubenswrapper[4735]: I0317 02:23:00.841361 4735 scope.go:117] "RemoveContainer" containerID="c6f761a685ff02965f06a78e8297ab783b2098f974f0618d857cee9c37e4f6c4" Mar 17 02:23:02 crc kubenswrapper[4735]: I0317 02:23:02.902767 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:23:02 crc kubenswrapper[4735]: I0317 02:23:02.951413 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:23:03 crc kubenswrapper[4735]: I0317 02:23:03.144904 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gvj99"] Mar 17 02:23:04 crc kubenswrapper[4735]: I0317 02:23:04.820064 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gvj99" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="registry-server" containerID="cri-o://48fa87c5881d66e7f0554db740ace678368dc64fb1f0f599b5a6f0a27e7ab2ce" gracePeriod=2 Mar 17 02:23:05 crc kubenswrapper[4735]: I0317 02:23:05.830190 4735 generic.go:334] "Generic (PLEG): container finished" podID="5330f494-4526-455b-92fb-4a0d690f991a" containerID="48fa87c5881d66e7f0554db740ace678368dc64fb1f0f599b5a6f0a27e7ab2ce" exitCode=0 Mar 17 02:23:05 crc kubenswrapper[4735]: I0317 
02:23:05.830228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerDied","Data":"48fa87c5881d66e7f0554db740ace678368dc64fb1f0f599b5a6f0a27e7ab2ce"} Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.323330 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.424509 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-catalog-content\") pod \"5330f494-4526-455b-92fb-4a0d690f991a\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.424675 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-utilities\") pod \"5330f494-4526-455b-92fb-4a0d690f991a\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.424722 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvr75\" (UniqueName: \"kubernetes.io/projected/5330f494-4526-455b-92fb-4a0d690f991a-kube-api-access-vvr75\") pod \"5330f494-4526-455b-92fb-4a0d690f991a\" (UID: \"5330f494-4526-455b-92fb-4a0d690f991a\") " Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.427122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-utilities" (OuterVolumeSpecName: "utilities") pod "5330f494-4526-455b-92fb-4a0d690f991a" (UID: "5330f494-4526-455b-92fb-4a0d690f991a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.465137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5330f494-4526-455b-92fb-4a0d690f991a-kube-api-access-vvr75" (OuterVolumeSpecName: "kube-api-access-vvr75") pod "5330f494-4526-455b-92fb-4a0d690f991a" (UID: "5330f494-4526-455b-92fb-4a0d690f991a"). InnerVolumeSpecName "kube-api-access-vvr75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.497127 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5330f494-4526-455b-92fb-4a0d690f991a" (UID: "5330f494-4526-455b-92fb-4a0d690f991a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.527287 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.527606 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5330f494-4526-455b-92fb-4a0d690f991a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.527620 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvr75\" (UniqueName: \"kubernetes.io/projected/5330f494-4526-455b-92fb-4a0d690f991a-kube-api-access-vvr75\") on node \"crc\" DevicePath \"\"" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.850953 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvj99" 
event={"ID":"5330f494-4526-455b-92fb-4a0d690f991a","Type":"ContainerDied","Data":"d5eb590d9f76efdd8fa596d1d2e7ff6a7ed4348267300e6d6afa0baf0efd91af"} Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.851024 4735 scope.go:117] "RemoveContainer" containerID="48fa87c5881d66e7f0554db740ace678368dc64fb1f0f599b5a6f0a27e7ab2ce" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.851228 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gvj99" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.898067 4735 scope.go:117] "RemoveContainer" containerID="6b9c47bf5e41a4e0d2d5fff5a8ac74abc758a2e1151e35a0996d1c97725b4f16" Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.898323 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gvj99"] Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.910609 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gvj99"] Mar 17 02:23:06 crc kubenswrapper[4735]: I0317 02:23:06.917995 4735 scope.go:117] "RemoveContainer" containerID="8caa40e6b0bfa060f4033be506d7ce688003c6b6c154a49823a282c8a36db5d0" Mar 17 02:23:07 crc kubenswrapper[4735]: I0317 02:23:07.083254 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5330f494-4526-455b-92fb-4a0d690f991a" path="/var/lib/kubelet/pods/5330f494-4526-455b-92fb-4a0d690f991a/volumes" Mar 17 02:23:09 crc kubenswrapper[4735]: I0317 02:23:09.073119 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:23:09 crc kubenswrapper[4735]: E0317 02:23:09.074785 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:23:23 crc kubenswrapper[4735]: I0317 02:23:23.078891 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:23:23 crc kubenswrapper[4735]: E0317 02:23:23.079540 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:23:34 crc kubenswrapper[4735]: I0317 02:23:34.073227 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:23:34 crc kubenswrapper[4735]: E0317 02:23:34.074033 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:23:46 crc kubenswrapper[4735]: I0317 02:23:46.073220 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:23:46 crc kubenswrapper[4735]: E0317 02:23:46.074970 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.584937 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7sjlf"] Mar 17 02:23:57 crc kubenswrapper[4735]: E0317 02:23:57.603047 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="registry-server" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.603096 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="registry-server" Mar 17 02:23:57 crc kubenswrapper[4735]: E0317 02:23:57.603116 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="extract-utilities" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.603125 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="extract-utilities" Mar 17 02:23:57 crc kubenswrapper[4735]: E0317 02:23:57.603174 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="extract-content" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.603183 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="extract-content" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.604366 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5330f494-4526-455b-92fb-4a0d690f991a" containerName="registry-server" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.610287 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.689279 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sjlf"] Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.746442 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-utilities\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.746528 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crbs2\" (UniqueName: \"kubernetes.io/projected/c6b2d232-943f-4d20-9498-9bd7e4982aca-kube-api-access-crbs2\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.746824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-catalog-content\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.848935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-catalog-content\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.849069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-utilities\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.849105 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crbs2\" (UniqueName: \"kubernetes.io/projected/c6b2d232-943f-4d20-9498-9bd7e4982aca-kube-api-access-crbs2\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.853873 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-catalog-content\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.855000 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-utilities\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.885605 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crbs2\" (UniqueName: \"kubernetes.io/projected/c6b2d232-943f-4d20-9498-9bd7e4982aca-kube-api-access-crbs2\") pod \"redhat-operators-7sjlf\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:57 crc kubenswrapper[4735]: I0317 02:23:57.937431 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:23:58 crc kubenswrapper[4735]: I0317 02:23:58.073567 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:23:58 crc kubenswrapper[4735]: E0317 02:23:58.074214 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:23:59 crc kubenswrapper[4735]: I0317 02:23:59.181500 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sjlf"] Mar 17 02:23:59 crc kubenswrapper[4735]: I0317 02:23:59.919088 4735 generic.go:334] "Generic (PLEG): container finished" podID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerID="460c0d4bc830eedc8a63d85f3157f20719f852eee4a24e03913dc6f63817bfbc" exitCode=0 Mar 17 02:23:59 crc kubenswrapper[4735]: I0317 02:23:59.919195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerDied","Data":"460c0d4bc830eedc8a63d85f3157f20719f852eee4a24e03913dc6f63817bfbc"} Mar 17 02:23:59 crc kubenswrapper[4735]: I0317 02:23:59.920073 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerStarted","Data":"315d22a710926f36e4dc9c1347361fccf38c1a1800f187347e5c723fe680dd72"} Mar 17 02:23:59 crc kubenswrapper[4735]: I0317 02:23:59.932604 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:24:00 crc 
kubenswrapper[4735]: I0317 02:24:00.267258 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561904-x4n49"] Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.268741 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.335339 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-x4n49"] Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.337660 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.338095 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.353728 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.430706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rlqc\" (UniqueName: \"kubernetes.io/projected/666386c3-46df-4527-9e07-28c12fe09982-kube-api-access-9rlqc\") pod \"auto-csr-approver-29561904-x4n49\" (UID: \"666386c3-46df-4527-9e07-28c12fe09982\") " pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.532685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rlqc\" (UniqueName: \"kubernetes.io/projected/666386c3-46df-4527-9e07-28c12fe09982-kube-api-access-9rlqc\") pod \"auto-csr-approver-29561904-x4n49\" (UID: \"666386c3-46df-4527-9e07-28c12fe09982\") " pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.562042 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9rlqc\" (UniqueName: \"kubernetes.io/projected/666386c3-46df-4527-9e07-28c12fe09982-kube-api-access-9rlqc\") pod \"auto-csr-approver-29561904-x4n49\" (UID: \"666386c3-46df-4527-9e07-28c12fe09982\") " pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:00 crc kubenswrapper[4735]: I0317 02:24:00.653695 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:01 crc kubenswrapper[4735]: I0317 02:24:01.220625 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-x4n49"] Mar 17 02:24:01 crc kubenswrapper[4735]: W0317 02:24:01.427012 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod666386c3_46df_4527_9e07_28c12fe09982.slice/crio-ab076deed3b7067096308ba6d696858ecc86b4483eeb82329a9e7a91bf1045fb WatchSource:0}: Error finding container ab076deed3b7067096308ba6d696858ecc86b4483eeb82329a9e7a91bf1045fb: Status 404 returned error can't find the container with id ab076deed3b7067096308ba6d696858ecc86b4483eeb82329a9e7a91bf1045fb Mar 17 02:24:01 crc kubenswrapper[4735]: I0317 02:24:01.948739 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-x4n49" event={"ID":"666386c3-46df-4527-9e07-28c12fe09982","Type":"ContainerStarted","Data":"ab076deed3b7067096308ba6d696858ecc86b4483eeb82329a9e7a91bf1045fb"} Mar 17 02:24:01 crc kubenswrapper[4735]: I0317 02:24:01.951835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerStarted","Data":"a3eefe3cabc0b757d21f08f8f37cb7bbcc5b2d916bd0dc16f0b78060a4b876c4"} Mar 17 02:24:03 crc kubenswrapper[4735]: I0317 02:24:03.971416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561904-x4n49" event={"ID":"666386c3-46df-4527-9e07-28c12fe09982","Type":"ContainerStarted","Data":"d51c66507a14187c20c31775a9f95d3d36bb1cdb3f11ab2f187b4af45e58ff40"} Mar 17 02:24:04 crc kubenswrapper[4735]: I0317 02:24:04.002894 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561904-x4n49" podStartSLOduration=2.913597121 podStartE2EDuration="3.999310328s" podCreationTimestamp="2026-03-17 02:24:00 +0000 UTC" firstStartedPulling="2026-03-17 02:24:01.429724918 +0000 UTC m=+4467.061957936" lastFinishedPulling="2026-03-17 02:24:02.515438165 +0000 UTC m=+4468.147671143" observedRunningTime="2026-03-17 02:24:03.995620649 +0000 UTC m=+4469.627853627" watchObservedRunningTime="2026-03-17 02:24:03.999310328 +0000 UTC m=+4469.631543306" Mar 17 02:24:05 crc kubenswrapper[4735]: I0317 02:24:05.989404 4735 generic.go:334] "Generic (PLEG): container finished" podID="666386c3-46df-4527-9e07-28c12fe09982" containerID="d51c66507a14187c20c31775a9f95d3d36bb1cdb3f11ab2f187b4af45e58ff40" exitCode=0 Mar 17 02:24:05 crc kubenswrapper[4735]: I0317 02:24:05.989466 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-x4n49" event={"ID":"666386c3-46df-4527-9e07-28c12fe09982","Type":"ContainerDied","Data":"d51c66507a14187c20c31775a9f95d3d36bb1cdb3f11ab2f187b4af45e58ff40"} Mar 17 02:24:07 crc kubenswrapper[4735]: I0317 02:24:07.644372 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:07 crc kubenswrapper[4735]: I0317 02:24:07.792209 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rlqc\" (UniqueName: \"kubernetes.io/projected/666386c3-46df-4527-9e07-28c12fe09982-kube-api-access-9rlqc\") pod \"666386c3-46df-4527-9e07-28c12fe09982\" (UID: \"666386c3-46df-4527-9e07-28c12fe09982\") " Mar 17 02:24:07 crc kubenswrapper[4735]: I0317 02:24:07.854977 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666386c3-46df-4527-9e07-28c12fe09982-kube-api-access-9rlqc" (OuterVolumeSpecName: "kube-api-access-9rlqc") pod "666386c3-46df-4527-9e07-28c12fe09982" (UID: "666386c3-46df-4527-9e07-28c12fe09982"). InnerVolumeSpecName "kube-api-access-9rlqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:24:07 crc kubenswrapper[4735]: I0317 02:24:07.895192 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rlqc\" (UniqueName: \"kubernetes.io/projected/666386c3-46df-4527-9e07-28c12fe09982-kube-api-access-9rlqc\") on node \"crc\" DevicePath \"\"" Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.019804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-x4n49" event={"ID":"666386c3-46df-4527-9e07-28c12fe09982","Type":"ContainerDied","Data":"ab076deed3b7067096308ba6d696858ecc86b4483eeb82329a9e7a91bf1045fb"} Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.020272 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab076deed3b7067096308ba6d696858ecc86b4483eeb82329a9e7a91bf1045fb" Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.020383 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-x4n49" Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.027459 4735 generic.go:334] "Generic (PLEG): container finished" podID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerID="a3eefe3cabc0b757d21f08f8f37cb7bbcc5b2d916bd0dc16f0b78060a4b876c4" exitCode=0 Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.027494 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerDied","Data":"a3eefe3cabc0b757d21f08f8f37cb7bbcc5b2d916bd0dc16f0b78060a4b876c4"} Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.766724 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-p9d4m"] Mar 17 02:24:08 crc kubenswrapper[4735]: I0317 02:24:08.776093 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-p9d4m"] Mar 17 02:24:09 crc kubenswrapper[4735]: I0317 02:24:09.036835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerStarted","Data":"9604de80836e180d23567f67c05986900d51d7eeec322f38d94a31e0bc89d3db"} Mar 17 02:24:09 crc kubenswrapper[4735]: I0317 02:24:09.082921 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e623a3-6185-4391-a462-a08bebec09a3" path="/var/lib/kubelet/pods/c0e623a3-6185-4391-a462-a08bebec09a3/volumes" Mar 17 02:24:10 crc kubenswrapper[4735]: I0317 02:24:10.073964 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:24:10 crc kubenswrapper[4735]: E0317 02:24:10.074496 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:24:17 crc kubenswrapper[4735]: I0317 02:24:17.938264 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:24:17 crc kubenswrapper[4735]: I0317 02:24:17.938714 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:24:19 crc kubenswrapper[4735]: I0317 02:24:19.024408 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sjlf" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" probeResult="failure" output=< Mar 17 02:24:19 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:24:19 crc kubenswrapper[4735]: > Mar 17 02:24:25 crc kubenswrapper[4735]: I0317 02:24:25.073348 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:24:25 crc kubenswrapper[4735]: E0317 02:24:25.074170 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:24:29 crc kubenswrapper[4735]: I0317 02:24:29.011595 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sjlf" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" probeResult="failure" output=< Mar 17 
02:24:29 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:24:29 crc kubenswrapper[4735]: > Mar 17 02:24:36 crc kubenswrapper[4735]: I0317 02:24:36.074703 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:24:36 crc kubenswrapper[4735]: E0317 02:24:36.075821 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:24:38 crc kubenswrapper[4735]: I0317 02:24:38.987415 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sjlf" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" probeResult="failure" output=< Mar 17 02:24:38 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:24:38 crc kubenswrapper[4735]: > Mar 17 02:24:47 crc kubenswrapper[4735]: I0317 02:24:47.074090 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:24:47 crc kubenswrapper[4735]: E0317 02:24:47.075090 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:24:49 crc kubenswrapper[4735]: I0317 02:24:49.071492 4735 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-7sjlf" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" probeResult="failure" output=< Mar 17 02:24:49 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:24:49 crc kubenswrapper[4735]: > Mar 17 02:24:59 crc kubenswrapper[4735]: I0317 02:24:59.009370 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sjlf" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" probeResult="failure" output=< Mar 17 02:24:59 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:24:59 crc kubenswrapper[4735]: > Mar 17 02:25:01 crc kubenswrapper[4735]: I0317 02:25:01.073987 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:25:01 crc kubenswrapper[4735]: E0317 02:25:01.074773 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:25:01 crc kubenswrapper[4735]: I0317 02:25:01.163588 4735 scope.go:117] "RemoveContainer" containerID="fcac3afcc18af5afb35add028c77e4e7088d310edd7c000a8a1120cb6471c56a" Mar 17 02:25:08 crc kubenswrapper[4735]: I0317 02:25:08.009529 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:25:08 crc kubenswrapper[4735]: I0317 02:25:08.088226 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:25:08 crc 
kubenswrapper[4735]: I0317 02:25:08.122336 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7sjlf" podStartSLOduration=62.571232176 podStartE2EDuration="1m11.117117911s" podCreationTimestamp="2026-03-17 02:23:57 +0000 UTC" firstStartedPulling="2026-03-17 02:23:59.92134231 +0000 UTC m=+4465.553575338" lastFinishedPulling="2026-03-17 02:24:08.467228085 +0000 UTC m=+4474.099461073" observedRunningTime="2026-03-17 02:24:09.063921535 +0000 UTC m=+4474.696154543" watchObservedRunningTime="2026-03-17 02:25:08.117117911 +0000 UTC m=+4533.749350909" Mar 17 02:25:08 crc kubenswrapper[4735]: I0317 02:25:08.342464 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sjlf"] Mar 17 02:25:09 crc kubenswrapper[4735]: I0317 02:25:09.587029 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7sjlf" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" containerID="cri-o://9604de80836e180d23567f67c05986900d51d7eeec322f38d94a31e0bc89d3db" gracePeriod=2 Mar 17 02:25:10 crc kubenswrapper[4735]: I0317 02:25:10.596390 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerDied","Data":"9604de80836e180d23567f67c05986900d51d7eeec322f38d94a31e0bc89d3db"} Mar 17 02:25:10 crc kubenswrapper[4735]: I0317 02:25:10.597162 4735 generic.go:334] "Generic (PLEG): container finished" podID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerID="9604de80836e180d23567f67c05986900d51d7eeec322f38d94a31e0bc89d3db" exitCode=0 Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.098639 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.270008 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crbs2\" (UniqueName: \"kubernetes.io/projected/c6b2d232-943f-4d20-9498-9bd7e4982aca-kube-api-access-crbs2\") pod \"c6b2d232-943f-4d20-9498-9bd7e4982aca\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.270065 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-utilities\") pod \"c6b2d232-943f-4d20-9498-9bd7e4982aca\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.270160 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-catalog-content\") pod \"c6b2d232-943f-4d20-9498-9bd7e4982aca\" (UID: \"c6b2d232-943f-4d20-9498-9bd7e4982aca\") " Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.277161 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-utilities" (OuterVolumeSpecName: "utilities") pod "c6b2d232-943f-4d20-9498-9bd7e4982aca" (UID: "c6b2d232-943f-4d20-9498-9bd7e4982aca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.306447 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b2d232-943f-4d20-9498-9bd7e4982aca-kube-api-access-crbs2" (OuterVolumeSpecName: "kube-api-access-crbs2") pod "c6b2d232-943f-4d20-9498-9bd7e4982aca" (UID: "c6b2d232-943f-4d20-9498-9bd7e4982aca"). InnerVolumeSpecName "kube-api-access-crbs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.372026 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crbs2\" (UniqueName: \"kubernetes.io/projected/c6b2d232-943f-4d20-9498-9bd7e4982aca-kube-api-access-crbs2\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.372072 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.469467 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6b2d232-943f-4d20-9498-9bd7e4982aca" (UID: "c6b2d232-943f-4d20-9498-9bd7e4982aca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.474098 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b2d232-943f-4d20-9498-9bd7e4982aca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.613456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sjlf" event={"ID":"c6b2d232-943f-4d20-9498-9bd7e4982aca","Type":"ContainerDied","Data":"315d22a710926f36e4dc9c1347361fccf38c1a1800f187347e5c723fe680dd72"} Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.613886 4735 scope.go:117] "RemoveContainer" containerID="9604de80836e180d23567f67c05986900d51d7eeec322f38d94a31e0bc89d3db" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.613746 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sjlf" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.682964 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sjlf"] Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.700622 4735 scope.go:117] "RemoveContainer" containerID="a3eefe3cabc0b757d21f08f8f37cb7bbcc5b2d916bd0dc16f0b78060a4b876c4" Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.711142 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7sjlf"] Mar 17 02:25:11 crc kubenswrapper[4735]: I0317 02:25:11.736192 4735 scope.go:117] "RemoveContainer" containerID="460c0d4bc830eedc8a63d85f3157f20719f852eee4a24e03913dc6f63817bfbc" Mar 17 02:25:13 crc kubenswrapper[4735]: I0317 02:25:13.094717 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" path="/var/lib/kubelet/pods/c6b2d232-943f-4d20-9498-9bd7e4982aca/volumes" Mar 17 02:25:16 crc kubenswrapper[4735]: I0317 02:25:16.073561 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:25:16 crc kubenswrapper[4735]: E0317 02:25:16.074084 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:25:27 crc kubenswrapper[4735]: I0317 02:25:27.073633 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:25:27 crc kubenswrapper[4735]: E0317 02:25:27.074329 4735 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:25:40 crc kubenswrapper[4735]: I0317 02:25:40.073536 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:25:40 crc kubenswrapper[4735]: E0317 02:25:40.074747 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:25:51 crc kubenswrapper[4735]: I0317 02:25:51.074889 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:25:51 crc kubenswrapper[4735]: E0317 02:25:51.075937 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.219233 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561906-r59ns"] Mar 17 02:26:00 crc kubenswrapper[4735]: E0317 02:26:00.223306 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="666386c3-46df-4527-9e07-28c12fe09982" containerName="oc" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.223762 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="666386c3-46df-4527-9e07-28c12fe09982" containerName="oc" Mar 17 02:26:00 crc kubenswrapper[4735]: E0317 02:26:00.223811 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="extract-utilities" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.223818 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="extract-utilities" Mar 17 02:26:00 crc kubenswrapper[4735]: E0317 02:26:00.223835 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.223842 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" Mar 17 02:26:00 crc kubenswrapper[4735]: E0317 02:26:00.223898 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="extract-content" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.223905 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="extract-content" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.225164 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b2d232-943f-4d20-9498-9bd7e4982aca" containerName="registry-server" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.225192 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="666386c3-46df-4527-9e07-28c12fe09982" containerName="oc" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.230127 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.244992 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.244997 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.246114 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.295366 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-r59ns"] Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.373294 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwq2\" (UniqueName: \"kubernetes.io/projected/61fb0b2f-feee-420b-b167-426a549dacfc-kube-api-access-5kwq2\") pod \"auto-csr-approver-29561906-r59ns\" (UID: \"61fb0b2f-feee-420b-b167-426a549dacfc\") " pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.475243 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwq2\" (UniqueName: \"kubernetes.io/projected/61fb0b2f-feee-420b-b167-426a549dacfc-kube-api-access-5kwq2\") pod \"auto-csr-approver-29561906-r59ns\" (UID: \"61fb0b2f-feee-420b-b167-426a549dacfc\") " pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.505895 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwq2\" (UniqueName: \"kubernetes.io/projected/61fb0b2f-feee-420b-b167-426a549dacfc-kube-api-access-5kwq2\") pod \"auto-csr-approver-29561906-r59ns\" (UID: \"61fb0b2f-feee-420b-b167-426a549dacfc\") " 
pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:00 crc kubenswrapper[4735]: I0317 02:26:00.572268 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:01 crc kubenswrapper[4735]: I0317 02:26:01.416128 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-r59ns"] Mar 17 02:26:01 crc kubenswrapper[4735]: W0317 02:26:01.484665 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61fb0b2f_feee_420b_b167_426a549dacfc.slice/crio-75d627ab31551b05ec24a397b3f4aa2c58e2a700670a40ff89b20e0ba0573fb0 WatchSource:0}: Error finding container 75d627ab31551b05ec24a397b3f4aa2c58e2a700670a40ff89b20e0ba0573fb0: Status 404 returned error can't find the container with id 75d627ab31551b05ec24a397b3f4aa2c58e2a700670a40ff89b20e0ba0573fb0 Mar 17 02:26:02 crc kubenswrapper[4735]: I0317 02:26:02.094116 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-r59ns" event={"ID":"61fb0b2f-feee-420b-b167-426a549dacfc","Type":"ContainerStarted","Data":"75d627ab31551b05ec24a397b3f4aa2c58e2a700670a40ff89b20e0ba0573fb0"} Mar 17 02:26:04 crc kubenswrapper[4735]: I0317 02:26:04.150209 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-r59ns" event={"ID":"61fb0b2f-feee-420b-b167-426a549dacfc","Type":"ContainerStarted","Data":"6b7e4e5e2c9526727ce32b2b198c1b48b3d21e719b8b6f4cd54f8c0639d53942"} Mar 17 02:26:05 crc kubenswrapper[4735]: I0317 02:26:05.082805 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:26:05 crc kubenswrapper[4735]: E0317 02:26:05.083465 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:26:05 crc kubenswrapper[4735]: I0317 02:26:05.160624 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-r59ns" event={"ID":"61fb0b2f-feee-420b-b167-426a549dacfc","Type":"ContainerDied","Data":"6b7e4e5e2c9526727ce32b2b198c1b48b3d21e719b8b6f4cd54f8c0639d53942"} Mar 17 02:26:05 crc kubenswrapper[4735]: I0317 02:26:05.160504 4735 generic.go:334] "Generic (PLEG): container finished" podID="61fb0b2f-feee-420b-b167-426a549dacfc" containerID="6b7e4e5e2c9526727ce32b2b198c1b48b3d21e719b8b6f4cd54f8c0639d53942" exitCode=0 Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.038173 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.146060 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kwq2\" (UniqueName: \"kubernetes.io/projected/61fb0b2f-feee-420b-b167-426a549dacfc-kube-api-access-5kwq2\") pod \"61fb0b2f-feee-420b-b167-426a549dacfc\" (UID: \"61fb0b2f-feee-420b-b167-426a549dacfc\") " Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.163307 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61fb0b2f-feee-420b-b167-426a549dacfc-kube-api-access-5kwq2" (OuterVolumeSpecName: "kube-api-access-5kwq2") pod "61fb0b2f-feee-420b-b167-426a549dacfc" (UID: "61fb0b2f-feee-420b-b167-426a549dacfc"). InnerVolumeSpecName "kube-api-access-5kwq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.180893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-r59ns" event={"ID":"61fb0b2f-feee-420b-b167-426a549dacfc","Type":"ContainerDied","Data":"75d627ab31551b05ec24a397b3f4aa2c58e2a700670a40ff89b20e0ba0573fb0"} Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.180941 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-r59ns" Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.181957 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75d627ab31551b05ec24a397b3f4aa2c58e2a700670a40ff89b20e0ba0573fb0" Mar 17 02:26:07 crc kubenswrapper[4735]: I0317 02:26:07.249005 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kwq2\" (UniqueName: \"kubernetes.io/projected/61fb0b2f-feee-420b-b167-426a549dacfc-kube-api-access-5kwq2\") on node \"crc\" DevicePath \"\"" Mar 17 02:26:08 crc kubenswrapper[4735]: I0317 02:26:08.145909 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-227pn"] Mar 17 02:26:08 crc kubenswrapper[4735]: I0317 02:26:08.159153 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-227pn"] Mar 17 02:26:09 crc kubenswrapper[4735]: I0317 02:26:09.083746 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf" path="/var/lib/kubelet/pods/4cf60b4b-f5d3-48f9-adb1-c342dbd3cbaf/volumes" Mar 17 02:26:17 crc kubenswrapper[4735]: I0317 02:26:17.072892 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:26:17 crc kubenswrapper[4735]: E0317 02:26:17.073498 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:26:28 crc kubenswrapper[4735]: I0317 02:26:28.073073 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:26:28 crc kubenswrapper[4735]: E0317 02:26:28.073827 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:26:39 crc kubenswrapper[4735]: I0317 02:26:39.073706 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:26:39 crc kubenswrapper[4735]: E0317 02:26:39.075617 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:26:54 crc kubenswrapper[4735]: I0317 02:26:54.074250 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:26:54 crc kubenswrapper[4735]: E0317 02:26:54.074950 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:27:01 crc kubenswrapper[4735]: I0317 02:27:01.596619 4735 scope.go:117] "RemoveContainer" containerID="dfe9fa387a05771ff172db8b9cf2660e45664f0df8c4aa2cc607d5c0c12b009c" Mar 17 02:27:06 crc kubenswrapper[4735]: I0317 02:27:06.073746 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:27:06 crc kubenswrapper[4735]: E0317 02:27:06.074545 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:27:20 crc kubenswrapper[4735]: I0317 02:27:20.073627 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:27:20 crc kubenswrapper[4735]: E0317 02:27:20.074485 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:27:32 crc kubenswrapper[4735]: I0317 02:27:32.074157 4735 scope.go:117] 
"RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:27:32 crc kubenswrapper[4735]: E0317 02:27:32.075138 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:27:46 crc kubenswrapper[4735]: I0317 02:27:46.073435 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:27:47 crc kubenswrapper[4735]: I0317 02:27:47.099774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"9ea8c9e239b85e42bd63a64b8e1b26689aa00fea8e35b1d69c29632cfc947436"} Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.406772 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561908-4nzx5"] Mar 17 02:28:00 crc kubenswrapper[4735]: E0317 02:28:00.412990 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fb0b2f-feee-420b-b167-426a549dacfc" containerName="oc" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.414569 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fb0b2f-feee-420b-b167-426a549dacfc" containerName="oc" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.415903 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="61fb0b2f-feee-420b-b167-426a549dacfc" containerName="oc" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.422915 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.444884 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.444882 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.444910 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.503032 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-4nzx5"] Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.573098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h7k4\" (UniqueName: \"kubernetes.io/projected/43fc6d4b-0c5d-40db-8c5c-844a377493c9-kube-api-access-8h7k4\") pod \"auto-csr-approver-29561908-4nzx5\" (UID: \"43fc6d4b-0c5d-40db-8c5c-844a377493c9\") " pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.675005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h7k4\" (UniqueName: \"kubernetes.io/projected/43fc6d4b-0c5d-40db-8c5c-844a377493c9-kube-api-access-8h7k4\") pod \"auto-csr-approver-29561908-4nzx5\" (UID: \"43fc6d4b-0c5d-40db-8c5c-844a377493c9\") " pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.710158 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h7k4\" (UniqueName: \"kubernetes.io/projected/43fc6d4b-0c5d-40db-8c5c-844a377493c9-kube-api-access-8h7k4\") pod \"auto-csr-approver-29561908-4nzx5\" (UID: \"43fc6d4b-0c5d-40db-8c5c-844a377493c9\") " 
pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:00 crc kubenswrapper[4735]: I0317 02:28:00.777003 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:02 crc kubenswrapper[4735]: I0317 02:28:02.259980 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-4nzx5"] Mar 17 02:28:03 crc kubenswrapper[4735]: I0317 02:28:03.234994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" event={"ID":"43fc6d4b-0c5d-40db-8c5c-844a377493c9","Type":"ContainerStarted","Data":"2455e0be000e68a323c883c48564fc3a4b8f21581c88d6cb3d3cd9fe23e8c239"} Mar 17 02:28:05 crc kubenswrapper[4735]: I0317 02:28:05.254255 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" event={"ID":"43fc6d4b-0c5d-40db-8c5c-844a377493c9","Type":"ContainerStarted","Data":"ac49f3dbfc1c2f8e1f97bc550f97aedd2f1c0f9ac82e6bf406f4e906d32b4258"} Mar 17 02:28:05 crc kubenswrapper[4735]: I0317 02:28:05.279158 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" podStartSLOduration=3.623524955 podStartE2EDuration="5.277964194s" podCreationTimestamp="2026-03-17 02:28:00 +0000 UTC" firstStartedPulling="2026-03-17 02:28:02.302653698 +0000 UTC m=+4707.934886676" lastFinishedPulling="2026-03-17 02:28:03.957092937 +0000 UTC m=+4709.589325915" observedRunningTime="2026-03-17 02:28:05.272427861 +0000 UTC m=+4710.904660839" watchObservedRunningTime="2026-03-17 02:28:05.277964194 +0000 UTC m=+4710.910197182" Mar 17 02:28:06 crc kubenswrapper[4735]: I0317 02:28:06.264381 4735 generic.go:334] "Generic (PLEG): container finished" podID="43fc6d4b-0c5d-40db-8c5c-844a377493c9" containerID="ac49f3dbfc1c2f8e1f97bc550f97aedd2f1c0f9ac82e6bf406f4e906d32b4258" exitCode=0 Mar 17 02:28:06 crc 
kubenswrapper[4735]: I0317 02:28:06.264434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" event={"ID":"43fc6d4b-0c5d-40db-8c5c-844a377493c9","Type":"ContainerDied","Data":"ac49f3dbfc1c2f8e1f97bc550f97aedd2f1c0f9ac82e6bf406f4e906d32b4258"} Mar 17 02:28:07 crc kubenswrapper[4735]: I0317 02:28:07.797345 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:07 crc kubenswrapper[4735]: I0317 02:28:07.930055 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h7k4\" (UniqueName: \"kubernetes.io/projected/43fc6d4b-0c5d-40db-8c5c-844a377493c9-kube-api-access-8h7k4\") pod \"43fc6d4b-0c5d-40db-8c5c-844a377493c9\" (UID: \"43fc6d4b-0c5d-40db-8c5c-844a377493c9\") " Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.262187 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-4zb2x"] Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.271278 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-4zb2x"] Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.282668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" event={"ID":"43fc6d4b-0c5d-40db-8c5c-844a377493c9","Type":"ContainerDied","Data":"2455e0be000e68a323c883c48564fc3a4b8f21581c88d6cb3d3cd9fe23e8c239"} Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.282746 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-4nzx5" Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.284226 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2455e0be000e68a323c883c48564fc3a4b8f21581c88d6cb3d3cd9fe23e8c239" Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.319415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fc6d4b-0c5d-40db-8c5c-844a377493c9-kube-api-access-8h7k4" (OuterVolumeSpecName: "kube-api-access-8h7k4") pod "43fc6d4b-0c5d-40db-8c5c-844a377493c9" (UID: "43fc6d4b-0c5d-40db-8c5c-844a377493c9"). InnerVolumeSpecName "kube-api-access-8h7k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:28:08 crc kubenswrapper[4735]: I0317 02:28:08.339005 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h7k4\" (UniqueName: \"kubernetes.io/projected/43fc6d4b-0c5d-40db-8c5c-844a377493c9-kube-api-access-8h7k4\") on node \"crc\" DevicePath \"\"" Mar 17 02:28:09 crc kubenswrapper[4735]: I0317 02:28:09.083629 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca12daa-c3de-4401-bdfc-32943298fa80" path="/var/lib/kubelet/pods/dca12daa-c3de-4401-bdfc-32943298fa80/volumes" Mar 17 02:29:01 crc kubenswrapper[4735]: I0317 02:29:01.978915 4735 scope.go:117] "RemoveContainer" containerID="c74e0d99b51df6be877e708d716d2ea5ccbb48be320b395d742a3b25c356a053" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.303899 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-smcgw"] Mar 17 02:29:55 crc kubenswrapper[4735]: E0317 02:29:55.308338 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fc6d4b-0c5d-40db-8c5c-844a377493c9" containerName="oc" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.308368 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fc6d4b-0c5d-40db-8c5c-844a377493c9" 
containerName="oc" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.310331 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fc6d4b-0c5d-40db-8c5c-844a377493c9" containerName="oc" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.318589 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.384013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px9gk\" (UniqueName: \"kubernetes.io/projected/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-kube-api-access-px9gk\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.384536 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-catalog-content\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.384921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-utilities\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.433995 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-smcgw"] Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.486713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-utilities\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.486828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px9gk\" (UniqueName: \"kubernetes.io/projected/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-kube-api-access-px9gk\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.486887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-catalog-content\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.487575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-utilities\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.487653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-catalog-content\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.508406 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px9gk\" (UniqueName: 
\"kubernetes.io/projected/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-kube-api-access-px9gk\") pod \"redhat-marketplace-smcgw\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:55 crc kubenswrapper[4735]: I0317 02:29:55.656989 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:29:57 crc kubenswrapper[4735]: I0317 02:29:57.086241 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-smcgw"] Mar 17 02:29:57 crc kubenswrapper[4735]: I0317 02:29:57.485954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerDied","Data":"fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe"} Mar 17 02:29:57 crc kubenswrapper[4735]: I0317 02:29:57.487150 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerID="fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe" exitCode=0 Mar 17 02:29:57 crc kubenswrapper[4735]: I0317 02:29:57.487465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerStarted","Data":"7d506ef01e068e9134bc9758bc84d6eb15193784f210556c760e801e9649b3f0"} Mar 17 02:29:57 crc kubenswrapper[4735]: I0317 02:29:57.491415 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:29:58 crc kubenswrapper[4735]: I0317 02:29:58.495974 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerStarted","Data":"add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b"} Mar 17 02:30:00 crc 
kubenswrapper[4735]: I0317 02:30:00.261807 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561910-nm5p8"] Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.266218 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.271876 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn"] Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.274480 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.281064 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.281065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.281155 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.281072 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.290318 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.290616 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn"] Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.381757 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29561910-nm5p8"] Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.385078 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kzqt\" (UniqueName: \"kubernetes.io/projected/7a7241f2-ed80-4f1d-8c72-56826fd49958-kube-api-access-4kzqt\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.385464 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7241f2-ed80-4f1d-8c72-56826fd49958-config-volume\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.385611 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ncbx\" (UniqueName: \"kubernetes.io/projected/16168670-2e45-49f9-81bc-9127b13dfb5e-kube-api-access-8ncbx\") pod \"auto-csr-approver-29561910-nm5p8\" (UID: \"16168670-2e45-49f9-81bc-9127b13dfb5e\") " pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.385716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7241f2-ed80-4f1d-8c72-56826fd49958-secret-volume\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.487954 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7a7241f2-ed80-4f1d-8c72-56826fd49958-config-volume\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.488033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ncbx\" (UniqueName: \"kubernetes.io/projected/16168670-2e45-49f9-81bc-9127b13dfb5e-kube-api-access-8ncbx\") pod \"auto-csr-approver-29561910-nm5p8\" (UID: \"16168670-2e45-49f9-81bc-9127b13dfb5e\") " pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.488081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7241f2-ed80-4f1d-8c72-56826fd49958-secret-volume\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.488128 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kzqt\" (UniqueName: \"kubernetes.io/projected/7a7241f2-ed80-4f1d-8c72-56826fd49958-kube-api-access-4kzqt\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.492680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7241f2-ed80-4f1d-8c72-56826fd49958-config-volume\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.506574 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7241f2-ed80-4f1d-8c72-56826fd49958-secret-volume\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.508050 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ncbx\" (UniqueName: \"kubernetes.io/projected/16168670-2e45-49f9-81bc-9127b13dfb5e-kube-api-access-8ncbx\") pod \"auto-csr-approver-29561910-nm5p8\" (UID: \"16168670-2e45-49f9-81bc-9127b13dfb5e\") " pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.511095 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kzqt\" (UniqueName: \"kubernetes.io/projected/7a7241f2-ed80-4f1d-8c72-56826fd49958-kube-api-access-4kzqt\") pod \"collect-profiles-29561910-dkgcn\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.520762 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerID="add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b" exitCode=0 Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.520801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerDied","Data":"add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b"} Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.595531 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:00 crc kubenswrapper[4735]: I0317 02:30:00.613283 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.056899 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-nm5p8"] Mar 17 02:30:01 crc kubenswrapper[4735]: W0317 02:30:01.135627 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7241f2_ed80_4f1d_8c72_56826fd49958.slice/crio-245d03dcd259d2f5bb4348890fc78bdae59d69029d4014750948f5aaf3a3e9f2 WatchSource:0}: Error finding container 245d03dcd259d2f5bb4348890fc78bdae59d69029d4014750948f5aaf3a3e9f2: Status 404 returned error can't find the container with id 245d03dcd259d2f5bb4348890fc78bdae59d69029d4014750948f5aaf3a3e9f2 Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.135834 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn"] Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.530275 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerStarted","Data":"a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca"} Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.532722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" event={"ID":"16168670-2e45-49f9-81bc-9127b13dfb5e","Type":"ContainerStarted","Data":"eb01a77b7707a38371eeb4211df5a80b67843b2a920dca587c53b77b443a6a85"} Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.537171 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" event={"ID":"7a7241f2-ed80-4f1d-8c72-56826fd49958","Type":"ContainerStarted","Data":"642f1d5f3d65cc61c1a39810025ecdd6d13738f5f1be49c48905f0bddd9b490e"} Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.537235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" event={"ID":"7a7241f2-ed80-4f1d-8c72-56826fd49958","Type":"ContainerStarted","Data":"245d03dcd259d2f5bb4348890fc78bdae59d69029d4014750948f5aaf3a3e9f2"} Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.561452 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-smcgw" podStartSLOduration=2.851965191 podStartE2EDuration="6.557333104s" podCreationTimestamp="2026-03-17 02:29:55 +0000 UTC" firstStartedPulling="2026-03-17 02:29:57.487350607 +0000 UTC m=+4823.119583585" lastFinishedPulling="2026-03-17 02:30:01.19271852 +0000 UTC m=+4826.824951498" observedRunningTime="2026-03-17 02:30:01.548749018 +0000 UTC m=+4827.180981996" watchObservedRunningTime="2026-03-17 02:30:01.557333104 +0000 UTC m=+4827.189566082" Mar 17 02:30:01 crc kubenswrapper[4735]: I0317 02:30:01.571417 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" podStartSLOduration=1.5714011700000001 podStartE2EDuration="1.57140117s" podCreationTimestamp="2026-03-17 02:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:30:01.5668294 +0000 UTC m=+4827.199062378" watchObservedRunningTime="2026-03-17 02:30:01.57140117 +0000 UTC m=+4827.203634148" Mar 17 02:30:02 crc kubenswrapper[4735]: I0317 02:30:02.546019 4735 generic.go:334] "Generic (PLEG): container finished" podID="7a7241f2-ed80-4f1d-8c72-56826fd49958" 
containerID="642f1d5f3d65cc61c1a39810025ecdd6d13738f5f1be49c48905f0bddd9b490e" exitCode=0 Mar 17 02:30:02 crc kubenswrapper[4735]: I0317 02:30:02.546143 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" event={"ID":"7a7241f2-ed80-4f1d-8c72-56826fd49958","Type":"ContainerDied","Data":"642f1d5f3d65cc61c1a39810025ecdd6d13738f5f1be49c48905f0bddd9b490e"} Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.016112 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.173054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kzqt\" (UniqueName: \"kubernetes.io/projected/7a7241f2-ed80-4f1d-8c72-56826fd49958-kube-api-access-4kzqt\") pod \"7a7241f2-ed80-4f1d-8c72-56826fd49958\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.173264 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7241f2-ed80-4f1d-8c72-56826fd49958-config-volume\") pod \"7a7241f2-ed80-4f1d-8c72-56826fd49958\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.173327 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7241f2-ed80-4f1d-8c72-56826fd49958-secret-volume\") pod \"7a7241f2-ed80-4f1d-8c72-56826fd49958\" (UID: \"7a7241f2-ed80-4f1d-8c72-56826fd49958\") " Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.176573 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7241f2-ed80-4f1d-8c72-56826fd49958-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"7a7241f2-ed80-4f1d-8c72-56826fd49958" (UID: "7a7241f2-ed80-4f1d-8c72-56826fd49958"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.183942 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7241f2-ed80-4f1d-8c72-56826fd49958-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a7241f2-ed80-4f1d-8c72-56826fd49958" (UID: "7a7241f2-ed80-4f1d-8c72-56826fd49958"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.190529 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7241f2-ed80-4f1d-8c72-56826fd49958-kube-api-access-4kzqt" (OuterVolumeSpecName: "kube-api-access-4kzqt") pod "7a7241f2-ed80-4f1d-8c72-56826fd49958" (UID: "7a7241f2-ed80-4f1d-8c72-56826fd49958"). InnerVolumeSpecName "kube-api-access-4kzqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.278276 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kzqt\" (UniqueName: \"kubernetes.io/projected/7a7241f2-ed80-4f1d-8c72-56826fd49958-kube-api-access-4kzqt\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.278317 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7241f2-ed80-4f1d-8c72-56826fd49958-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.278327 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7241f2-ed80-4f1d-8c72-56826fd49958-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.562258 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" event={"ID":"7a7241f2-ed80-4f1d-8c72-56826fd49958","Type":"ContainerDied","Data":"245d03dcd259d2f5bb4348890fc78bdae59d69029d4014750948f5aaf3a3e9f2"} Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.562300 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245d03dcd259d2f5bb4348890fc78bdae59d69029d4014750948f5aaf3a3e9f2" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.562336 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn" Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.564703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" event={"ID":"16168670-2e45-49f9-81bc-9127b13dfb5e","Type":"ContainerStarted","Data":"2728457a42f3eaba04db4e785fc13a0c47bbbbca45164c90257ff0b36570a285"} Mar 17 02:30:04 crc kubenswrapper[4735]: I0317 02:30:04.588410 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" podStartSLOduration=2.52715709 podStartE2EDuration="4.588389481s" podCreationTimestamp="2026-03-17 02:30:00 +0000 UTC" firstStartedPulling="2026-03-17 02:30:01.064430684 +0000 UTC m=+4826.696663662" lastFinishedPulling="2026-03-17 02:30:03.125663055 +0000 UTC m=+4828.757896053" observedRunningTime="2026-03-17 02:30:04.580909703 +0000 UTC m=+4830.213142691" watchObservedRunningTime="2026-03-17 02:30:04.588389481 +0000 UTC m=+4830.220622469" Mar 17 02:30:05 crc kubenswrapper[4735]: I0317 02:30:05.107319 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb"] Mar 17 02:30:05 crc kubenswrapper[4735]: I0317 02:30:05.119328 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-dl9qb"] Mar 17 02:30:05 crc kubenswrapper[4735]: I0317 02:30:05.657368 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:30:05 crc kubenswrapper[4735]: I0317 02:30:05.657416 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:30:06 crc kubenswrapper[4735]: I0317 02:30:06.583597 4735 generic.go:334] "Generic (PLEG): container finished" podID="16168670-2e45-49f9-81bc-9127b13dfb5e" 
containerID="2728457a42f3eaba04db4e785fc13a0c47bbbbca45164c90257ff0b36570a285" exitCode=0 Mar 17 02:30:06 crc kubenswrapper[4735]: I0317 02:30:06.583673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" event={"ID":"16168670-2e45-49f9-81bc-9127b13dfb5e","Type":"ContainerDied","Data":"2728457a42f3eaba04db4e785fc13a0c47bbbbca45164c90257ff0b36570a285"} Mar 17 02:30:06 crc kubenswrapper[4735]: I0317 02:30:06.719751 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-smcgw" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="registry-server" probeResult="failure" output=< Mar 17 02:30:06 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:30:06 crc kubenswrapper[4735]: > Mar 17 02:30:07 crc kubenswrapper[4735]: I0317 02:30:07.086895 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa06a89-796f-4cea-b11b-c75a723deda6" path="/var/lib/kubelet/pods/1aa06a89-796f-4cea-b11b-c75a723deda6/volumes" Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.018050 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.156770 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ncbx\" (UniqueName: \"kubernetes.io/projected/16168670-2e45-49f9-81bc-9127b13dfb5e-kube-api-access-8ncbx\") pod \"16168670-2e45-49f9-81bc-9127b13dfb5e\" (UID: \"16168670-2e45-49f9-81bc-9127b13dfb5e\") " Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.164606 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16168670-2e45-49f9-81bc-9127b13dfb5e-kube-api-access-8ncbx" (OuterVolumeSpecName: "kube-api-access-8ncbx") pod "16168670-2e45-49f9-81bc-9127b13dfb5e" (UID: "16168670-2e45-49f9-81bc-9127b13dfb5e"). InnerVolumeSpecName "kube-api-access-8ncbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.259260 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ncbx\" (UniqueName: \"kubernetes.io/projected/16168670-2e45-49f9-81bc-9127b13dfb5e-kube-api-access-8ncbx\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.599685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" event={"ID":"16168670-2e45-49f9-81bc-9127b13dfb5e","Type":"ContainerDied","Data":"eb01a77b7707a38371eeb4211df5a80b67843b2a920dca587c53b77b443a6a85"} Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.599919 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-nm5p8" Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.599942 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb01a77b7707a38371eeb4211df5a80b67843b2a920dca587c53b77b443a6a85" Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.668707 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-x4n49"] Mar 17 02:30:08 crc kubenswrapper[4735]: I0317 02:30:08.687719 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-x4n49"] Mar 17 02:30:09 crc kubenswrapper[4735]: I0317 02:30:09.086092 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="666386c3-46df-4527-9e07-28c12fe09982" path="/var/lib/kubelet/pods/666386c3-46df-4527-9e07-28c12fe09982/volumes" Mar 17 02:30:12 crc kubenswrapper[4735]: I0317 02:30:12.606313 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:30:12 crc kubenswrapper[4735]: I0317 02:30:12.607042 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:30:14 crc kubenswrapper[4735]: I0317 02:30:14.241004 4735 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-6xl7d container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Mar 17 02:30:14 crc kubenswrapper[4735]: I0317 02:30:14.241096 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xl7d" podUID="11908f1a-0399-4964-86cc-8a2e51d35821" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 02:30:14 crc kubenswrapper[4735]: E0317 02:30:14.287703 4735 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.175s" Mar 17 02:30:15 crc kubenswrapper[4735]: I0317 02:30:15.728114 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:30:15 crc kubenswrapper[4735]: I0317 02:30:15.789461 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:30:16 crc kubenswrapper[4735]: I0317 02:30:16.866352 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-smcgw"] Mar 17 02:30:17 crc kubenswrapper[4735]: I0317 02:30:17.336723 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-smcgw" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="registry-server" containerID="cri-o://a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca" gracePeriod=2 Mar 17 02:30:17 crc kubenswrapper[4735]: I0317 02:30:17.851728 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:30:17 crc kubenswrapper[4735]: I0317 02:30:17.974988 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px9gk\" (UniqueName: \"kubernetes.io/projected/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-kube-api-access-px9gk\") pod \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " Mar 17 02:30:17 crc kubenswrapper[4735]: I0317 02:30:17.975185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-catalog-content\") pod \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " Mar 17 02:30:17 crc kubenswrapper[4735]: I0317 02:30:17.975281 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-utilities\") pod \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\" (UID: \"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e\") " Mar 17 02:30:17 crc kubenswrapper[4735]: I0317 02:30:17.975708 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-utilities" (OuterVolumeSpecName: "utilities") pod "ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" (UID: "ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.008992 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" (UID: "ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.077632 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.077943 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.221909 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-kube-api-access-px9gk" (OuterVolumeSpecName: "kube-api-access-px9gk") pod "ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" (UID: "ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e"). InnerVolumeSpecName "kube-api-access-px9gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.281907 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px9gk\" (UniqueName: \"kubernetes.io/projected/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e-kube-api-access-px9gk\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.349503 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerID="a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca" exitCode=0 Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.349542 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerDied","Data":"a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca"} Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.349611 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smcgw" event={"ID":"ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e","Type":"ContainerDied","Data":"7d506ef01e068e9134bc9758bc84d6eb15193784f210556c760e801e9649b3f0"} Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.349620 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smcgw" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.351383 4735 scope.go:117] "RemoveContainer" containerID="a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.396947 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-smcgw"] Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.403647 4735 scope.go:117] "RemoveContainer" containerID="add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.404670 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-smcgw"] Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.436953 4735 scope.go:117] "RemoveContainer" containerID="fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.478325 4735 scope.go:117] "RemoveContainer" containerID="a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca" Mar 17 02:30:18 crc kubenswrapper[4735]: E0317 02:30:18.483497 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca\": container with ID starting with a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca not found: ID does not exist" containerID="a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 
02:30:18.483540 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca"} err="failed to get container status \"a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca\": rpc error: code = NotFound desc = could not find container \"a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca\": container with ID starting with a1bf17158f27ba91b75dbe51e0baee4308ba63f4a005f8de4dc87c736f9b58ca not found: ID does not exist" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.483567 4735 scope.go:117] "RemoveContainer" containerID="add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b" Mar 17 02:30:18 crc kubenswrapper[4735]: E0317 02:30:18.484360 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b\": container with ID starting with add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b not found: ID does not exist" containerID="add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.484410 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b"} err="failed to get container status \"add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b\": rpc error: code = NotFound desc = could not find container \"add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b\": container with ID starting with add94fef0b537711ae200e988c0370baa7d6f75c7c88de581ca9b975c338e77b not found: ID does not exist" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.484441 4735 scope.go:117] "RemoveContainer" containerID="fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe" Mar 17 02:30:18 crc 
kubenswrapper[4735]: E0317 02:30:18.484815 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe\": container with ID starting with fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe not found: ID does not exist" containerID="fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe" Mar 17 02:30:18 crc kubenswrapper[4735]: I0317 02:30:18.484842 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe"} err="failed to get container status \"fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe\": rpc error: code = NotFound desc = could not find container \"fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe\": container with ID starting with fb14014976f2e3b5153e32a33cef83ee71b29364cdd7892c446303a817fffcfe not found: ID does not exist" Mar 17 02:30:19 crc kubenswrapper[4735]: I0317 02:30:19.089727 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" path="/var/lib/kubelet/pods/ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e/volumes" Mar 17 02:30:42 crc kubenswrapper[4735]: I0317 02:30:42.606659 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:30:42 crc kubenswrapper[4735]: I0317 02:30:42.607226 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 17 02:31:02 crc kubenswrapper[4735]: I0317 02:31:02.274424 4735 scope.go:117] "RemoveContainer" containerID="e0132bf9d26399a9b096e4fcb665d5ec0c6d47b9a3cc0683301ef945b2af8b4e" Mar 17 02:31:02 crc kubenswrapper[4735]: I0317 02:31:02.369778 4735 scope.go:117] "RemoveContainer" containerID="d51c66507a14187c20c31775a9f95d3d36bb1cdb3f11ab2f187b4af45e58ff40" Mar 17 02:31:12 crc kubenswrapper[4735]: I0317 02:31:12.606825 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:31:12 crc kubenswrapper[4735]: I0317 02:31:12.607376 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:31:12 crc kubenswrapper[4735]: I0317 02:31:12.607414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:31:12 crc kubenswrapper[4735]: I0317 02:31:12.608065 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ea8c9e239b85e42bd63a64b8e1b26689aa00fea8e35b1d69c29632cfc947436"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:31:12 crc kubenswrapper[4735]: I0317 02:31:12.608123 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://9ea8c9e239b85e42bd63a64b8e1b26689aa00fea8e35b1d69c29632cfc947436" gracePeriod=600 Mar 17 02:31:13 crc kubenswrapper[4735]: I0317 02:31:13.177958 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="9ea8c9e239b85e42bd63a64b8e1b26689aa00fea8e35b1d69c29632cfc947436" exitCode=0 Mar 17 02:31:13 crc kubenswrapper[4735]: I0317 02:31:13.178212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"9ea8c9e239b85e42bd63a64b8e1b26689aa00fea8e35b1d69c29632cfc947436"} Mar 17 02:31:13 crc kubenswrapper[4735]: I0317 02:31:13.178296 4735 scope.go:117] "RemoveContainer" containerID="88e69405afa73aec764f14293bec1363be2dadf4fcaa6524bb024b47d2f36486" Mar 17 02:31:14 crc kubenswrapper[4735]: I0317 02:31:14.192519 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"} Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.139276 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5zlz9"] Mar 17 02:31:32 crc kubenswrapper[4735]: E0317 02:31:32.141331 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7241f2-ed80-4f1d-8c72-56826fd49958" containerName="collect-profiles" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.141353 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7241f2-ed80-4f1d-8c72-56826fd49958" containerName="collect-profiles" Mar 17 02:31:32 crc kubenswrapper[4735]: E0317 02:31:32.141394 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="extract-content" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.141403 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="extract-content" Mar 17 02:31:32 crc kubenswrapper[4735]: E0317 02:31:32.141426 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16168670-2e45-49f9-81bc-9127b13dfb5e" containerName="oc" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.141437 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="16168670-2e45-49f9-81bc-9127b13dfb5e" containerName="oc" Mar 17 02:31:32 crc kubenswrapper[4735]: E0317 02:31:32.141452 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="registry-server" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.141460 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="registry-server" Mar 17 02:31:32 crc kubenswrapper[4735]: E0317 02:31:32.141473 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="extract-utilities" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.141481 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="extract-utilities" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.142269 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7241f2-ed80-4f1d-8c72-56826fd49958" containerName="collect-profiles" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.142319 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="16168670-2e45-49f9-81bc-9127b13dfb5e" containerName="oc" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.142343 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ba74b6b6-3a96-4ef2-a549-1bc87fe6d93e" containerName="registry-server" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.144147 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.177501 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zlz9"] Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.292814 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-utilities\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.292925 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-catalog-content\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.293232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5vl\" (UniqueName: \"kubernetes.io/projected/f006d722-9d17-4f5b-8267-a1543178522e-kube-api-access-jw5vl\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.395581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5vl\" (UniqueName: \"kubernetes.io/projected/f006d722-9d17-4f5b-8267-a1543178522e-kube-api-access-jw5vl\") pod \"certified-operators-5zlz9\" 
(UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.395683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-utilities\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.395715 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-catalog-content\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.396175 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-catalog-content\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.396332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-utilities\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.416256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5vl\" (UniqueName: \"kubernetes.io/projected/f006d722-9d17-4f5b-8267-a1543178522e-kube-api-access-jw5vl\") pod \"certified-operators-5zlz9\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " 
pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:32 crc kubenswrapper[4735]: I0317 02:31:32.474369 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:33 crc kubenswrapper[4735]: I0317 02:31:33.789303 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zlz9"] Mar 17 02:31:34 crc kubenswrapper[4735]: W0317 02:31:34.636163 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf006d722_9d17_4f5b_8267_a1543178522e.slice/crio-540bc041d94b09ad778305319f8a3df1d9c0bd4a1af105ed639ae6f5e53b08ff WatchSource:0}: Error finding container 540bc041d94b09ad778305319f8a3df1d9c0bd4a1af105ed639ae6f5e53b08ff: Status 404 returned error can't find the container with id 540bc041d94b09ad778305319f8a3df1d9c0bd4a1af105ed639ae6f5e53b08ff Mar 17 02:31:35 crc kubenswrapper[4735]: I0317 02:31:35.379235 4735 generic.go:334] "Generic (PLEG): container finished" podID="f006d722-9d17-4f5b-8267-a1543178522e" containerID="338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d" exitCode=0 Mar 17 02:31:35 crc kubenswrapper[4735]: I0317 02:31:35.379304 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerDied","Data":"338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d"} Mar 17 02:31:35 crc kubenswrapper[4735]: I0317 02:31:35.379538 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerStarted","Data":"540bc041d94b09ad778305319f8a3df1d9c0bd4a1af105ed639ae6f5e53b08ff"} Mar 17 02:31:37 crc kubenswrapper[4735]: I0317 02:31:37.399126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerStarted","Data":"a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce"} Mar 17 02:31:39 crc kubenswrapper[4735]: I0317 02:31:39.418252 4735 generic.go:334] "Generic (PLEG): container finished" podID="f006d722-9d17-4f5b-8267-a1543178522e" containerID="a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce" exitCode=0 Mar 17 02:31:39 crc kubenswrapper[4735]: I0317 02:31:39.418284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerDied","Data":"a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce"} Mar 17 02:31:40 crc kubenswrapper[4735]: I0317 02:31:40.429528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerStarted","Data":"d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229"} Mar 17 02:31:40 crc kubenswrapper[4735]: I0317 02:31:40.456270 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5zlz9" podStartSLOduration=3.987701416 podStartE2EDuration="8.455641704s" podCreationTimestamp="2026-03-17 02:31:32 +0000 UTC" firstStartedPulling="2026-03-17 02:31:35.381974381 +0000 UTC m=+4921.014207349" lastFinishedPulling="2026-03-17 02:31:39.849914659 +0000 UTC m=+4925.482147637" observedRunningTime="2026-03-17 02:31:40.445633055 +0000 UTC m=+4926.077866043" watchObservedRunningTime="2026-03-17 02:31:40.455641704 +0000 UTC m=+4926.087874692" Mar 17 02:31:42 crc kubenswrapper[4735]: I0317 02:31:42.475021 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:42 crc kubenswrapper[4735]: I0317 02:31:42.475592 
4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:42 crc kubenswrapper[4735]: I0317 02:31:42.529921 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:52 crc kubenswrapper[4735]: I0317 02:31:52.533728 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:52 crc kubenswrapper[4735]: I0317 02:31:52.631947 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zlz9"] Mar 17 02:31:52 crc kubenswrapper[4735]: I0317 02:31:52.632681 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5zlz9" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="registry-server" containerID="cri-o://d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229" gracePeriod=2 Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.239136 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.272695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw5vl\" (UniqueName: \"kubernetes.io/projected/f006d722-9d17-4f5b-8267-a1543178522e-kube-api-access-jw5vl\") pod \"f006d722-9d17-4f5b-8267-a1543178522e\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.272763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-catalog-content\") pod \"f006d722-9d17-4f5b-8267-a1543178522e\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.272848 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-utilities\") pod \"f006d722-9d17-4f5b-8267-a1543178522e\" (UID: \"f006d722-9d17-4f5b-8267-a1543178522e\") " Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.279286 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-utilities" (OuterVolumeSpecName: "utilities") pod "f006d722-9d17-4f5b-8267-a1543178522e" (UID: "f006d722-9d17-4f5b-8267-a1543178522e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.293330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f006d722-9d17-4f5b-8267-a1543178522e-kube-api-access-jw5vl" (OuterVolumeSpecName: "kube-api-access-jw5vl") pod "f006d722-9d17-4f5b-8267-a1543178522e" (UID: "f006d722-9d17-4f5b-8267-a1543178522e"). InnerVolumeSpecName "kube-api-access-jw5vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.377283 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw5vl\" (UniqueName: \"kubernetes.io/projected/f006d722-9d17-4f5b-8267-a1543178522e-kube-api-access-jw5vl\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.377593 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.388169 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f006d722-9d17-4f5b-8267-a1543178522e" (UID: "f006d722-9d17-4f5b-8267-a1543178522e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.482097 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f006d722-9d17-4f5b-8267-a1543178522e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.585843 4735 generic.go:334] "Generic (PLEG): container finished" podID="f006d722-9d17-4f5b-8267-a1543178522e" containerID="d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229" exitCode=0 Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.585945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerDied","Data":"d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229"} Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.585985 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-5zlz9" event={"ID":"f006d722-9d17-4f5b-8267-a1543178522e","Type":"ContainerDied","Data":"540bc041d94b09ad778305319f8a3df1d9c0bd4a1af105ed639ae6f5e53b08ff"} Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.586019 4735 scope.go:117] "RemoveContainer" containerID="d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.586208 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zlz9" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.646621 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zlz9"] Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.649092 4735 scope.go:117] "RemoveContainer" containerID="a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.657066 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5zlz9"] Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.678397 4735 scope.go:117] "RemoveContainer" containerID="338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.756477 4735 scope.go:117] "RemoveContainer" containerID="d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229" Mar 17 02:31:53 crc kubenswrapper[4735]: E0317 02:31:53.758757 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229\": container with ID starting with d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229 not found: ID does not exist" containerID="d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 
02:31:53.758917 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229"} err="failed to get container status \"d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229\": rpc error: code = NotFound desc = could not find container \"d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229\": container with ID starting with d1bff71507e55860d0e4d3ee93bc12db6c15c12e63db919d435f16f670916229 not found: ID does not exist" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.758982 4735 scope.go:117] "RemoveContainer" containerID="a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce" Mar 17 02:31:53 crc kubenswrapper[4735]: E0317 02:31:53.760669 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce\": container with ID starting with a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce not found: ID does not exist" containerID="a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.760787 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce"} err="failed to get container status \"a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce\": rpc error: code = NotFound desc = could not find container \"a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce\": container with ID starting with a6ddddf3eb9838b4bb49414f37459b040cb6c95d6741050cacbf81c8326af6ce not found: ID does not exist" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.760834 4735 scope.go:117] "RemoveContainer" containerID="338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d" Mar 17 02:31:53 crc 
kubenswrapper[4735]: E0317 02:31:53.761325 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d\": container with ID starting with 338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d not found: ID does not exist" containerID="338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d" Mar 17 02:31:53 crc kubenswrapper[4735]: I0317 02:31:53.761384 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d"} err="failed to get container status \"338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d\": rpc error: code = NotFound desc = could not find container \"338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d\": container with ID starting with 338ea4579f18cba700d1eb6ed2b715efc08dd4e3ab1856e00dcf5fedc75c442d not found: ID does not exist" Mar 17 02:31:55 crc kubenswrapper[4735]: I0317 02:31:55.094157 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f006d722-9d17-4f5b-8267-a1543178522e" path="/var/lib/kubelet/pods/f006d722-9d17-4f5b-8267-a1543178522e/volumes" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.206788 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561912-6lzxg"] Mar 17 02:32:00 crc kubenswrapper[4735]: E0317 02:32:00.207618 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="registry-server" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.207630 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="registry-server" Mar 17 02:32:00 crc kubenswrapper[4735]: E0317 02:32:00.207659 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="extract-utilities" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.207666 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="extract-utilities" Mar 17 02:32:00 crc kubenswrapper[4735]: E0317 02:32:00.207686 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="extract-content" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.207691 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="extract-content" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.207897 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f006d722-9d17-4f5b-8267-a1543178522e" containerName="registry-server" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.209181 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.220111 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.220114 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.220125 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.230389 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-6lzxg"] Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.330735 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wzq\" (UniqueName: 
\"kubernetes.io/projected/0c3aa9ac-fe44-4755-85f6-e430d301286d-kube-api-access-26wzq\") pod \"auto-csr-approver-29561912-6lzxg\" (UID: \"0c3aa9ac-fe44-4755-85f6-e430d301286d\") " pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.432836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wzq\" (UniqueName: \"kubernetes.io/projected/0c3aa9ac-fe44-4755-85f6-e430d301286d-kube-api-access-26wzq\") pod \"auto-csr-approver-29561912-6lzxg\" (UID: \"0c3aa9ac-fe44-4755-85f6-e430d301286d\") " pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.462636 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wzq\" (UniqueName: \"kubernetes.io/projected/0c3aa9ac-fe44-4755-85f6-e430d301286d-kube-api-access-26wzq\") pod \"auto-csr-approver-29561912-6lzxg\" (UID: \"0c3aa9ac-fe44-4755-85f6-e430d301286d\") " pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:00 crc kubenswrapper[4735]: I0317 02:32:00.533728 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:01 crc kubenswrapper[4735]: I0317 02:32:01.051734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-6lzxg"] Mar 17 02:32:01 crc kubenswrapper[4735]: I0317 02:32:01.668769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" event={"ID":"0c3aa9ac-fe44-4755-85f6-e430d301286d","Type":"ContainerStarted","Data":"3f4a9baa0d398b9af0c1abbe7aec415c9aeb95947da7ba2ed9c3fa2efa902152"} Mar 17 02:32:03 crc kubenswrapper[4735]: I0317 02:32:03.689133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" event={"ID":"0c3aa9ac-fe44-4755-85f6-e430d301286d","Type":"ContainerStarted","Data":"c5e966f95b8b2609776da84a2ebf88be0666c6826e8918c1c565b4145c689d65"} Mar 17 02:32:03 crc kubenswrapper[4735]: I0317 02:32:03.706046 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" podStartSLOduration=2.584611695 podStartE2EDuration="3.706019515s" podCreationTimestamp="2026-03-17 02:32:00 +0000 UTC" firstStartedPulling="2026-03-17 02:32:01.056808333 +0000 UTC m=+4946.689041331" lastFinishedPulling="2026-03-17 02:32:02.178216143 +0000 UTC m=+4947.810449151" observedRunningTime="2026-03-17 02:32:03.700152845 +0000 UTC m=+4949.332385823" watchObservedRunningTime="2026-03-17 02:32:03.706019515 +0000 UTC m=+4949.338252493" Mar 17 02:32:04 crc kubenswrapper[4735]: I0317 02:32:04.703191 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c3aa9ac-fe44-4755-85f6-e430d301286d" containerID="c5e966f95b8b2609776da84a2ebf88be0666c6826e8918c1c565b4145c689d65" exitCode=0 Mar 17 02:32:04 crc kubenswrapper[4735]: I0317 02:32:04.703392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" 
event={"ID":"0c3aa9ac-fe44-4755-85f6-e430d301286d","Type":"ContainerDied","Data":"c5e966f95b8b2609776da84a2ebf88be0666c6826e8918c1c565b4145c689d65"} Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.181834 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.357779 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26wzq\" (UniqueName: \"kubernetes.io/projected/0c3aa9ac-fe44-4755-85f6-e430d301286d-kube-api-access-26wzq\") pod \"0c3aa9ac-fe44-4755-85f6-e430d301286d\" (UID: \"0c3aa9ac-fe44-4755-85f6-e430d301286d\") " Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.365848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3aa9ac-fe44-4755-85f6-e430d301286d-kube-api-access-26wzq" (OuterVolumeSpecName: "kube-api-access-26wzq") pod "0c3aa9ac-fe44-4755-85f6-e430d301286d" (UID: "0c3aa9ac-fe44-4755-85f6-e430d301286d"). InnerVolumeSpecName "kube-api-access-26wzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.460768 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26wzq\" (UniqueName: \"kubernetes.io/projected/0c3aa9ac-fe44-4755-85f6-e430d301286d-kube-api-access-26wzq\") on node \"crc\" DevicePath \"\"" Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.732251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" event={"ID":"0c3aa9ac-fe44-4755-85f6-e430d301286d","Type":"ContainerDied","Data":"3f4a9baa0d398b9af0c1abbe7aec415c9aeb95947da7ba2ed9c3fa2efa902152"} Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.732286 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4a9baa0d398b9af0c1abbe7aec415c9aeb95947da7ba2ed9c3fa2efa902152" Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.732387 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-6lzxg" Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.816176 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-r59ns"] Mar 17 02:32:06 crc kubenswrapper[4735]: I0317 02:32:06.827762 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-r59ns"] Mar 17 02:32:07 crc kubenswrapper[4735]: I0317 02:32:07.086414 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61fb0b2f-feee-420b-b167-426a549dacfc" path="/var/lib/kubelet/pods/61fb0b2f-feee-420b-b167-426a549dacfc/volumes" Mar 17 02:33:02 crc kubenswrapper[4735]: I0317 02:33:02.633440 4735 scope.go:117] "RemoveContainer" containerID="6b7e4e5e2c9526727ce32b2b198c1b48b3d21e719b8b6f4cd54f8c0639d53942" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.825851 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-f6rlb"] Mar 17 02:33:41 crc kubenswrapper[4735]: E0317 02:33:41.826767 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3aa9ac-fe44-4755-85f6-e430d301286d" containerName="oc" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.826781 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3aa9ac-fe44-4755-85f6-e430d301286d" containerName="oc" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.827001 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3aa9ac-fe44-4755-85f6-e430d301286d" containerName="oc" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.828452 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.835116 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6rlb"] Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.941533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxsf\" (UniqueName: \"kubernetes.io/projected/50f7c747-8e1b-439a-aad9-30bec7839cb4-kube-api-access-lqxsf\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.941598 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-utilities\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:41 crc kubenswrapper[4735]: I0317 02:33:41.941630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-catalog-content\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.043840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxsf\" (UniqueName: \"kubernetes.io/projected/50f7c747-8e1b-439a-aad9-30bec7839cb4-kube-api-access-lqxsf\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.043943 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-utilities\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.043986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-catalog-content\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.044637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-catalog-content\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.044964 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-utilities\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.062565 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxsf\" (UniqueName: \"kubernetes.io/projected/50f7c747-8e1b-439a-aad9-30bec7839cb4-kube-api-access-lqxsf\") pod \"community-operators-f6rlb\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.144566 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.598228 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6rlb"] Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.607039 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:33:42 crc kubenswrapper[4735]: I0317 02:33:42.607102 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:33:43 crc kubenswrapper[4735]: I0317 02:33:43.711208 4735 generic.go:334] "Generic (PLEG): container finished" podID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerID="5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e" exitCode=0 Mar 17 02:33:43 
crc kubenswrapper[4735]: I0317 02:33:43.711401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerDied","Data":"5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e"} Mar 17 02:33:43 crc kubenswrapper[4735]: I0317 02:33:43.711547 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerStarted","Data":"d76fb352882aac8dc4ba989ddc7834e994d1d463978caadc32df5f1a6ec42455"} Mar 17 02:33:44 crc kubenswrapper[4735]: I0317 02:33:44.720418 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerStarted","Data":"58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead"} Mar 17 02:33:46 crc kubenswrapper[4735]: I0317 02:33:46.743746 4735 generic.go:334] "Generic (PLEG): container finished" podID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerID="58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead" exitCode=0 Mar 17 02:33:46 crc kubenswrapper[4735]: I0317 02:33:46.744161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerDied","Data":"58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead"} Mar 17 02:33:47 crc kubenswrapper[4735]: I0317 02:33:47.757848 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerStarted","Data":"82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7"} Mar 17 02:33:47 crc kubenswrapper[4735]: I0317 02:33:47.784798 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-f6rlb" podStartSLOduration=3.354982191 podStartE2EDuration="6.78477717s" podCreationTimestamp="2026-03-17 02:33:41 +0000 UTC" firstStartedPulling="2026-03-17 02:33:43.713275142 +0000 UTC m=+5049.345508120" lastFinishedPulling="2026-03-17 02:33:47.143070091 +0000 UTC m=+5052.775303099" observedRunningTime="2026-03-17 02:33:47.775731864 +0000 UTC m=+5053.407964852" watchObservedRunningTime="2026-03-17 02:33:47.78477717 +0000 UTC m=+5053.417010158" Mar 17 02:33:52 crc kubenswrapper[4735]: I0317 02:33:52.144950 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:52 crc kubenswrapper[4735]: I0317 02:33:52.145457 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:33:53 crc kubenswrapper[4735]: I0317 02:33:53.208983 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-f6rlb" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="registry-server" probeResult="failure" output=< Mar 17 02:33:53 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:33:53 crc kubenswrapper[4735]: > Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.143077 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561914-q26dv"] Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.144737 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.147389 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.147547 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.147639 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.154182 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-q26dv"] Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.298027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhz6j\" (UniqueName: \"kubernetes.io/projected/ba73d723-c937-438b-9ad8-3fc8baef218c-kube-api-access-fhz6j\") pod \"auto-csr-approver-29561914-q26dv\" (UID: \"ba73d723-c937-438b-9ad8-3fc8baef218c\") " pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.399889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhz6j\" (UniqueName: \"kubernetes.io/projected/ba73d723-c937-438b-9ad8-3fc8baef218c-kube-api-access-fhz6j\") pod \"auto-csr-approver-29561914-q26dv\" (UID: \"ba73d723-c937-438b-9ad8-3fc8baef218c\") " pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.429007 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhz6j\" (UniqueName: \"kubernetes.io/projected/ba73d723-c937-438b-9ad8-3fc8baef218c-kube-api-access-fhz6j\") pod \"auto-csr-approver-29561914-q26dv\" (UID: \"ba73d723-c937-438b-9ad8-3fc8baef218c\") " 
pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:00 crc kubenswrapper[4735]: I0317 02:34:00.476030 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:01 crc kubenswrapper[4735]: I0317 02:34:01.335067 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-q26dv"] Mar 17 02:34:01 crc kubenswrapper[4735]: I0317 02:34:01.882571 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-q26dv" event={"ID":"ba73d723-c937-438b-9ad8-3fc8baef218c","Type":"ContainerStarted","Data":"a53da8d008844f42503c2b1bb4cc4869afd0b4f58ad1c4c5ee2ce466bf092184"} Mar 17 02:34:02 crc kubenswrapper[4735]: I0317 02:34:02.196275 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:34:02 crc kubenswrapper[4735]: I0317 02:34:02.262779 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:34:02 crc kubenswrapper[4735]: I0317 02:34:02.437831 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f6rlb"] Mar 17 02:34:03 crc kubenswrapper[4735]: I0317 02:34:03.904587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-q26dv" event={"ID":"ba73d723-c937-438b-9ad8-3fc8baef218c","Type":"ContainerStarted","Data":"d0205944742ee5fa5f479ba56b720aaf040d5b61bfb09ad2c27dfe3d5ed87dd3"} Mar 17 02:34:03 crc kubenswrapper[4735]: I0317 02:34:03.904647 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f6rlb" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="registry-server" containerID="cri-o://82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7" gracePeriod=2 
Mar 17 02:34:03 crc kubenswrapper[4735]: I0317 02:34:03.934180 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561914-q26dv" podStartSLOduration=2.980238336 podStartE2EDuration="3.934154538s" podCreationTimestamp="2026-03-17 02:34:00 +0000 UTC" firstStartedPulling="2026-03-17 02:34:01.358760166 +0000 UTC m=+5066.990993144" lastFinishedPulling="2026-03-17 02:34:02.312676368 +0000 UTC m=+5067.944909346" observedRunningTime="2026-03-17 02:34:03.929403273 +0000 UTC m=+5069.561636261" watchObservedRunningTime="2026-03-17 02:34:03.934154538 +0000 UTC m=+5069.566387556" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.595757 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.690507 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-utilities\") pod \"50f7c747-8e1b-439a-aad9-30bec7839cb4\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.690750 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-catalog-content\") pod \"50f7c747-8e1b-439a-aad9-30bec7839cb4\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.691332 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqxsf\" (UniqueName: \"kubernetes.io/projected/50f7c747-8e1b-439a-aad9-30bec7839cb4-kube-api-access-lqxsf\") pod \"50f7c747-8e1b-439a-aad9-30bec7839cb4\" (UID: \"50f7c747-8e1b-439a-aad9-30bec7839cb4\") " Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.691776 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-utilities" (OuterVolumeSpecName: "utilities") pod "50f7c747-8e1b-439a-aad9-30bec7839cb4" (UID: "50f7c747-8e1b-439a-aad9-30bec7839cb4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.722014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f7c747-8e1b-439a-aad9-30bec7839cb4-kube-api-access-lqxsf" (OuterVolumeSpecName: "kube-api-access-lqxsf") pod "50f7c747-8e1b-439a-aad9-30bec7839cb4" (UID: "50f7c747-8e1b-439a-aad9-30bec7839cb4"). InnerVolumeSpecName "kube-api-access-lqxsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.776811 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50f7c747-8e1b-439a-aad9-30bec7839cb4" (UID: "50f7c747-8e1b-439a-aad9-30bec7839cb4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.793466 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqxsf\" (UniqueName: \"kubernetes.io/projected/50f7c747-8e1b-439a-aad9-30bec7839cb4-kube-api-access-lqxsf\") on node \"crc\" DevicePath \"\"" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.793733 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.793799 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50f7c747-8e1b-439a-aad9-30bec7839cb4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.917966 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba73d723-c937-438b-9ad8-3fc8baef218c" containerID="d0205944742ee5fa5f479ba56b720aaf040d5b61bfb09ad2c27dfe3d5ed87dd3" exitCode=0 Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.918098 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-q26dv" event={"ID":"ba73d723-c937-438b-9ad8-3fc8baef218c","Type":"ContainerDied","Data":"d0205944742ee5fa5f479ba56b720aaf040d5b61bfb09ad2c27dfe3d5ed87dd3"} Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.921919 4735 generic.go:334] "Generic (PLEG): container finished" podID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerID="82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7" exitCode=0 Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.921948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" 
event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerDied","Data":"82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7"} Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.921964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6rlb" event={"ID":"50f7c747-8e1b-439a-aad9-30bec7839cb4","Type":"ContainerDied","Data":"d76fb352882aac8dc4ba989ddc7834e994d1d463978caadc32df5f1a6ec42455"} Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.921982 4735 scope.go:117] "RemoveContainer" containerID="82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.922124 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6rlb" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.951621 4735 scope.go:117] "RemoveContainer" containerID="58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead" Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.965427 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f6rlb"] Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.978190 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f6rlb"] Mar 17 02:34:04 crc kubenswrapper[4735]: I0317 02:34:04.989616 4735 scope.go:117] "RemoveContainer" containerID="5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.021122 4735 scope.go:117] "RemoveContainer" containerID="82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7" Mar 17 02:34:05 crc kubenswrapper[4735]: E0317 02:34:05.023315 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7\": container 
with ID starting with 82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7 not found: ID does not exist" containerID="82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.023353 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7"} err="failed to get container status \"82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7\": rpc error: code = NotFound desc = could not find container \"82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7\": container with ID starting with 82fdac69adba56c4f0daa62dca0e55aed76445164e20f5214b9f5a8808cffad7 not found: ID does not exist" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.023385 4735 scope.go:117] "RemoveContainer" containerID="58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead" Mar 17 02:34:05 crc kubenswrapper[4735]: E0317 02:34:05.023713 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead\": container with ID starting with 58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead not found: ID does not exist" containerID="58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.023741 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead"} err="failed to get container status \"58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead\": rpc error: code = NotFound desc = could not find container \"58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead\": container with ID starting with 58e280e74d851de5876142571bf05b27c98e0fe7ec29e65cfe5c8991f1756ead not 
found: ID does not exist" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.023754 4735 scope.go:117] "RemoveContainer" containerID="5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e" Mar 17 02:34:05 crc kubenswrapper[4735]: E0317 02:34:05.023972 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e\": container with ID starting with 5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e not found: ID does not exist" containerID="5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.023992 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e"} err="failed to get container status \"5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e\": rpc error: code = NotFound desc = could not find container \"5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e\": container with ID starting with 5908621c98ba4ce73a4ae455714f07549508f0605ddebde24122253e2fb1192e not found: ID does not exist" Mar 17 02:34:05 crc kubenswrapper[4735]: I0317 02:34:05.097026 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" path="/var/lib/kubelet/pods/50f7c747-8e1b-439a-aad9-30bec7839cb4/volumes" Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.355120 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.529312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhz6j\" (UniqueName: \"kubernetes.io/projected/ba73d723-c937-438b-9ad8-3fc8baef218c-kube-api-access-fhz6j\") pod \"ba73d723-c937-438b-9ad8-3fc8baef218c\" (UID: \"ba73d723-c937-438b-9ad8-3fc8baef218c\") " Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.537163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba73d723-c937-438b-9ad8-3fc8baef218c-kube-api-access-fhz6j" (OuterVolumeSpecName: "kube-api-access-fhz6j") pod "ba73d723-c937-438b-9ad8-3fc8baef218c" (UID: "ba73d723-c937-438b-9ad8-3fc8baef218c"). InnerVolumeSpecName "kube-api-access-fhz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.631378 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhz6j\" (UniqueName: \"kubernetes.io/projected/ba73d723-c937-438b-9ad8-3fc8baef218c-kube-api-access-fhz6j\") on node \"crc\" DevicePath \"\"" Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.946074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-q26dv" event={"ID":"ba73d723-c937-438b-9ad8-3fc8baef218c","Type":"ContainerDied","Data":"a53da8d008844f42503c2b1bb4cc4869afd0b4f58ad1c4c5ee2ce466bf092184"} Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.946402 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a53da8d008844f42503c2b1bb4cc4869afd0b4f58ad1c4c5ee2ce466bf092184" Mar 17 02:34:06 crc kubenswrapper[4735]: I0317 02:34:06.946126 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-q26dv" Mar 17 02:34:07 crc kubenswrapper[4735]: I0317 02:34:07.007576 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-4nzx5"] Mar 17 02:34:07 crc kubenswrapper[4735]: I0317 02:34:07.015010 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-4nzx5"] Mar 17 02:34:07 crc kubenswrapper[4735]: I0317 02:34:07.081960 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fc6d4b-0c5d-40db-8c5c-844a377493c9" path="/var/lib/kubelet/pods/43fc6d4b-0c5d-40db-8c5c-844a377493c9/volumes" Mar 17 02:34:12 crc kubenswrapper[4735]: I0317 02:34:12.606502 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:34:12 crc kubenswrapper[4735]: I0317 02:34:12.607046 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:34:42 crc kubenswrapper[4735]: I0317 02:34:42.607103 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:34:42 crc kubenswrapper[4735]: I0317 02:34:42.607825 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:34:42 crc kubenswrapper[4735]: I0317 02:34:42.607936 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:34:42 crc kubenswrapper[4735]: I0317 02:34:42.609311 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:34:42 crc kubenswrapper[4735]: I0317 02:34:42.609443 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" gracePeriod=600 Mar 17 02:34:42 crc kubenswrapper[4735]: E0317 02:34:42.753717 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:34:43 crc kubenswrapper[4735]: I0317 02:34:43.318794 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" exitCode=0 Mar 17 
02:34:43 crc kubenswrapper[4735]: I0317 02:34:43.318832 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"} Mar 17 02:34:43 crc kubenswrapper[4735]: I0317 02:34:43.319251 4735 scope.go:117] "RemoveContainer" containerID="9ea8c9e239b85e42bd63a64b8e1b26689aa00fea8e35b1d69c29632cfc947436" Mar 17 02:34:43 crc kubenswrapper[4735]: I0317 02:34:43.319973 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:34:43 crc kubenswrapper[4735]: E0317 02:34:43.320257 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:34:56 crc kubenswrapper[4735]: I0317 02:34:56.073443 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:34:56 crc kubenswrapper[4735]: E0317 02:34:56.074290 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:35:02 crc kubenswrapper[4735]: I0317 02:35:02.916307 4735 scope.go:117] "RemoveContainer" 
containerID="ac49f3dbfc1c2f8e1f97bc550f97aedd2f1c0f9ac82e6bf406f4e906d32b4258" Mar 17 02:35:07 crc kubenswrapper[4735]: I0317 02:35:07.072935 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:35:07 crc kubenswrapper[4735]: E0317 02:35:07.073754 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:35:19 crc kubenswrapper[4735]: I0317 02:35:19.073344 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:35:19 crc kubenswrapper[4735]: E0317 02:35:19.073992 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.903520 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sskrv"] Mar 17 02:35:26 crc kubenswrapper[4735]: E0317 02:35:26.904517 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="extract-content" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.904533 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="extract-content" Mar 17 
02:35:26 crc kubenswrapper[4735]: E0317 02:35:26.904553 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba73d723-c937-438b-9ad8-3fc8baef218c" containerName="oc" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.904561 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba73d723-c937-438b-9ad8-3fc8baef218c" containerName="oc" Mar 17 02:35:26 crc kubenswrapper[4735]: E0317 02:35:26.904586 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="registry-server" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.904615 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="registry-server" Mar 17 02:35:26 crc kubenswrapper[4735]: E0317 02:35:26.904658 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="extract-utilities" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.904669 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="extract-utilities" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.904983 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f7c747-8e1b-439a-aad9-30bec7839cb4" containerName="registry-server" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.905019 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba73d723-c937-438b-9ad8-3fc8baef218c" containerName="oc" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.909346 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:26 crc kubenswrapper[4735]: I0317 02:35:26.921646 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sskrv"] Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.043344 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-catalog-content\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.043434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4kbp\" (UniqueName: \"kubernetes.io/projected/18d652b4-cf32-4f7e-af9f-e3371c0381a7-kube-api-access-v4kbp\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.043475 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-utilities\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.144892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-catalog-content\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.144986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-v4kbp\" (UniqueName: \"kubernetes.io/projected/18d652b4-cf32-4f7e-af9f-e3371c0381a7-kube-api-access-v4kbp\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.145029 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-utilities\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.145470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-utilities\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.145994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-catalog-content\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.522081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4kbp\" (UniqueName: \"kubernetes.io/projected/18d652b4-cf32-4f7e-af9f-e3371c0381a7-kube-api-access-v4kbp\") pod \"redhat-operators-sskrv\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") " pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:27 crc kubenswrapper[4735]: I0317 02:35:27.553393 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sskrv" Mar 17 02:35:28 crc kubenswrapper[4735]: I0317 02:35:28.050023 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sskrv"] Mar 17 02:35:28 crc kubenswrapper[4735]: W0317 02:35:28.052696 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d652b4_cf32_4f7e_af9f_e3371c0381a7.slice/crio-075e73d8c45ba2b588cd0d38875b56684169c978309f8cfe8b5c528d9bbf5a9a WatchSource:0}: Error finding container 075e73d8c45ba2b588cd0d38875b56684169c978309f8cfe8b5c528d9bbf5a9a: Status 404 returned error can't find the container with id 075e73d8c45ba2b588cd0d38875b56684169c978309f8cfe8b5c528d9bbf5a9a Mar 17 02:35:28 crc kubenswrapper[4735]: I0317 02:35:28.745364 4735 generic.go:334] "Generic (PLEG): container finished" podID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerID="c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57" exitCode=0 Mar 17 02:35:28 crc kubenswrapper[4735]: I0317 02:35:28.745462 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerDied","Data":"c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57"} Mar 17 02:35:28 crc kubenswrapper[4735]: I0317 02:35:28.746024 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerStarted","Data":"075e73d8c45ba2b588cd0d38875b56684169c978309f8cfe8b5c528d9bbf5a9a"} Mar 17 02:35:28 crc kubenswrapper[4735]: I0317 02:35:28.748964 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:35:29 crc kubenswrapper[4735]: I0317 02:35:29.755650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerStarted","Data":"1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1"} Mar 17 02:35:32 crc kubenswrapper[4735]: I0317 02:35:32.072972 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:35:32 crc kubenswrapper[4735]: E0317 02:35:32.073550 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:35:34 crc kubenswrapper[4735]: I0317 02:35:34.822234 4735 generic.go:334] "Generic (PLEG): container finished" podID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerID="1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1" exitCode=0 Mar 17 02:35:34 crc kubenswrapper[4735]: I0317 02:35:34.822366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerDied","Data":"1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1"} Mar 17 02:35:35 crc kubenswrapper[4735]: I0317 02:35:35.833707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerStarted","Data":"6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e"} Mar 17 02:35:35 crc kubenswrapper[4735]: I0317 02:35:35.877382 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sskrv" podStartSLOduration=3.243275274 
podStartE2EDuration="9.877350717s" podCreationTimestamp="2026-03-17 02:35:26 +0000 UTC" firstStartedPulling="2026-03-17 02:35:28.747582559 +0000 UTC m=+5154.379815537" lastFinishedPulling="2026-03-17 02:35:35.381657972 +0000 UTC m=+5161.013890980" observedRunningTime="2026-03-17 02:35:35.859484549 +0000 UTC m=+5161.491717527" watchObservedRunningTime="2026-03-17 02:35:35.877350717 +0000 UTC m=+5161.509583725"
Mar 17 02:35:37 crc kubenswrapper[4735]: I0317 02:35:37.554449 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sskrv"
Mar 17 02:35:37 crc kubenswrapper[4735]: I0317 02:35:37.554764 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sskrv"
Mar 17 02:35:38 crc kubenswrapper[4735]: I0317 02:35:38.617085 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sskrv" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server" probeResult="failure" output=<
Mar 17 02:35:38 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 02:35:38 crc kubenswrapper[4735]: >
Mar 17 02:35:46 crc kubenswrapper[4735]: I0317 02:35:46.073588 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:35:46 crc kubenswrapper[4735]: E0317 02:35:46.074476 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:35:48 crc kubenswrapper[4735]: I0317 02:35:48.623367 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sskrv" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server" probeResult="failure" output=<
Mar 17 02:35:48 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 02:35:48 crc kubenswrapper[4735]: >
Mar 17 02:35:58 crc kubenswrapper[4735]: I0317 02:35:58.888441 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sskrv" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server" probeResult="failure" output=<
Mar 17 02:35:58 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 02:35:58 crc kubenswrapper[4735]: >
Mar 17 02:35:59 crc kubenswrapper[4735]: I0317 02:35:59.075040 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:35:59 crc kubenswrapper[4735]: E0317 02:35:59.075420 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.150681 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561916-frgcv"]
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.153177 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.156747 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.157640 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.161675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-frgcv"]
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.164722 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.302758 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhhj\" (UniqueName: \"kubernetes.io/projected/00a573ac-bdf0-41a5-847c-513987633028-kube-api-access-xkhhj\") pod \"auto-csr-approver-29561916-frgcv\" (UID: \"00a573ac-bdf0-41a5-847c-513987633028\") " pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.404653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhhj\" (UniqueName: \"kubernetes.io/projected/00a573ac-bdf0-41a5-847c-513987633028-kube-api-access-xkhhj\") pod \"auto-csr-approver-29561916-frgcv\" (UID: \"00a573ac-bdf0-41a5-847c-513987633028\") " pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.445488 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhhj\" (UniqueName: \"kubernetes.io/projected/00a573ac-bdf0-41a5-847c-513987633028-kube-api-access-xkhhj\") pod \"auto-csr-approver-29561916-frgcv\" (UID: \"00a573ac-bdf0-41a5-847c-513987633028\") " pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:00 crc kubenswrapper[4735]: I0317 02:36:00.480892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:01 crc kubenswrapper[4735]: I0317 02:36:01.138003 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-frgcv"]
Mar 17 02:36:02 crc kubenswrapper[4735]: I0317 02:36:02.089624 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-frgcv" event={"ID":"00a573ac-bdf0-41a5-847c-513987633028","Type":"ContainerStarted","Data":"544fc4cfb1dee5bd91ddb5ffa2c2cb11825f86bf107d5f5e5345a5e20c74076c"}
Mar 17 02:36:03 crc kubenswrapper[4735]: I0317 02:36:03.099083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-frgcv" event={"ID":"00a573ac-bdf0-41a5-847c-513987633028","Type":"ContainerStarted","Data":"1ac628716a0faa23d402d3e05915b9835641e2c2765fe0cfda168639258d5cab"}
Mar 17 02:36:03 crc kubenswrapper[4735]: I0317 02:36:03.121169 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561916-frgcv" podStartSLOduration=2.271888245 podStartE2EDuration="3.121146191s" podCreationTimestamp="2026-03-17 02:36:00 +0000 UTC" firstStartedPulling="2026-03-17 02:36:01.149642064 +0000 UTC m=+5186.781875042" lastFinishedPulling="2026-03-17 02:36:01.99890001 +0000 UTC m=+5187.631132988" observedRunningTime="2026-03-17 02:36:03.113830535 +0000 UTC m=+5188.746063513" watchObservedRunningTime="2026-03-17 02:36:03.121146191 +0000 UTC m=+5188.753379189"
Mar 17 02:36:04 crc kubenswrapper[4735]: I0317 02:36:04.112447 4735 generic.go:334] "Generic (PLEG): container finished" podID="00a573ac-bdf0-41a5-847c-513987633028" containerID="1ac628716a0faa23d402d3e05915b9835641e2c2765fe0cfda168639258d5cab" exitCode=0
Mar 17 02:36:04 crc kubenswrapper[4735]: I0317 02:36:04.112543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-frgcv" event={"ID":"00a573ac-bdf0-41a5-847c-513987633028","Type":"ContainerDied","Data":"1ac628716a0faa23d402d3e05915b9835641e2c2765fe0cfda168639258d5cab"}
Mar 17 02:36:05 crc kubenswrapper[4735]: I0317 02:36:05.747427 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:05 crc kubenswrapper[4735]: I0317 02:36:05.813705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhhj\" (UniqueName: \"kubernetes.io/projected/00a573ac-bdf0-41a5-847c-513987633028-kube-api-access-xkhhj\") pod \"00a573ac-bdf0-41a5-847c-513987633028\" (UID: \"00a573ac-bdf0-41a5-847c-513987633028\") "
Mar 17 02:36:05 crc kubenswrapper[4735]: I0317 02:36:05.822754 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a573ac-bdf0-41a5-847c-513987633028-kube-api-access-xkhhj" (OuterVolumeSpecName: "kube-api-access-xkhhj") pod "00a573ac-bdf0-41a5-847c-513987633028" (UID: "00a573ac-bdf0-41a5-847c-513987633028"). InnerVolumeSpecName "kube-api-access-xkhhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 02:36:05 crc kubenswrapper[4735]: I0317 02:36:05.917010 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhhj\" (UniqueName: \"kubernetes.io/projected/00a573ac-bdf0-41a5-847c-513987633028-kube-api-access-xkhhj\") on node \"crc\" DevicePath \"\""
Mar 17 02:36:06 crc kubenswrapper[4735]: I0317 02:36:06.135632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-frgcv" event={"ID":"00a573ac-bdf0-41a5-847c-513987633028","Type":"ContainerDied","Data":"544fc4cfb1dee5bd91ddb5ffa2c2cb11825f86bf107d5f5e5345a5e20c74076c"}
Mar 17 02:36:06 crc kubenswrapper[4735]: I0317 02:36:06.135691 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="544fc4cfb1dee5bd91ddb5ffa2c2cb11825f86bf107d5f5e5345a5e20c74076c"
Mar 17 02:36:06 crc kubenswrapper[4735]: I0317 02:36:06.135768 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-frgcv"
Mar 17 02:36:06 crc kubenswrapper[4735]: I0317 02:36:06.195119 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-nm5p8"]
Mar 17 02:36:06 crc kubenswrapper[4735]: I0317 02:36:06.206954 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-nm5p8"]
Mar 17 02:36:07 crc kubenswrapper[4735]: I0317 02:36:07.095543 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16168670-2e45-49f9-81bc-9127b13dfb5e" path="/var/lib/kubelet/pods/16168670-2e45-49f9-81bc-9127b13dfb5e/volumes"
Mar 17 02:36:07 crc kubenswrapper[4735]: I0317 02:36:07.642184 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sskrv"
Mar 17 02:36:07 crc kubenswrapper[4735]: I0317 02:36:07.719022 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sskrv"
Mar 17 02:36:07 crc kubenswrapper[4735]: I0317 02:36:07.901512 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sskrv"]
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.164930 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sskrv" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server" containerID="cri-o://6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e" gracePeriod=2
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.840246 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sskrv"
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.908972 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-catalog-content\") pod \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") "
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.909125 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4kbp\" (UniqueName: \"kubernetes.io/projected/18d652b4-cf32-4f7e-af9f-e3371c0381a7-kube-api-access-v4kbp\") pod \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") "
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.909196 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-utilities\") pod \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\" (UID: \"18d652b4-cf32-4f7e-af9f-e3371c0381a7\") "
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.910215 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-utilities" (OuterVolumeSpecName: "utilities") pod "18d652b4-cf32-4f7e-af9f-e3371c0381a7" (UID: "18d652b4-cf32-4f7e-af9f-e3371c0381a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 02:36:09 crc kubenswrapper[4735]: I0317 02:36:09.922413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d652b4-cf32-4f7e-af9f-e3371c0381a7-kube-api-access-v4kbp" (OuterVolumeSpecName: "kube-api-access-v4kbp") pod "18d652b4-cf32-4f7e-af9f-e3371c0381a7" (UID: "18d652b4-cf32-4f7e-af9f-e3371c0381a7"). InnerVolumeSpecName "kube-api-access-v4kbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.011087 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4kbp\" (UniqueName: \"kubernetes.io/projected/18d652b4-cf32-4f7e-af9f-e3371c0381a7-kube-api-access-v4kbp\") on node \"crc\" DevicePath \"\""
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.011124 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.069314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18d652b4-cf32-4f7e-af9f-e3371c0381a7" (UID: "18d652b4-cf32-4f7e-af9f-e3371c0381a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.112785 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d652b4-cf32-4f7e-af9f-e3371c0381a7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.180009 4735 generic.go:334] "Generic (PLEG): container finished" podID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerID="6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e" exitCode=0
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.180074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerDied","Data":"6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e"}
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.180114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sskrv" event={"ID":"18d652b4-cf32-4f7e-af9f-e3371c0381a7","Type":"ContainerDied","Data":"075e73d8c45ba2b588cd0d38875b56684169c978309f8cfe8b5c528d9bbf5a9a"}
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.180145 4735 scope.go:117] "RemoveContainer" containerID="6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.180346 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sskrv"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.209997 4735 scope.go:117] "RemoveContainer" containerID="1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.237117 4735 scope.go:117] "RemoveContainer" containerID="c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.239394 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sskrv"]
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.249720 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sskrv"]
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.295205 4735 scope.go:117] "RemoveContainer" containerID="6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e"
Mar 17 02:36:10 crc kubenswrapper[4735]: E0317 02:36:10.299408 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e\": container with ID starting with 6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e not found: ID does not exist" containerID="6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.299452 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e"} err="failed to get container status \"6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e\": rpc error: code = NotFound desc = could not find container \"6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e\": container with ID starting with 6d40bc5b5e4572ff174a0b236680b8eb1ef8226699100dbda600100844227d0e not found: ID does not exist"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.299486 4735 scope.go:117] "RemoveContainer" containerID="1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1"
Mar 17 02:36:10 crc kubenswrapper[4735]: E0317 02:36:10.301328 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1\": container with ID starting with 1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1 not found: ID does not exist" containerID="1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.301349 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1"} err="failed to get container status \"1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1\": rpc error: code = NotFound desc = could not find container \"1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1\": container with ID starting with 1362f5b87bac78f81db04f104933fa97445a77bce485c32d539bab9b3bca88e1 not found: ID does not exist"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.301363 4735 scope.go:117] "RemoveContainer" containerID="c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57"
Mar 17 02:36:10 crc kubenswrapper[4735]: E0317 02:36:10.301928 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57\": container with ID starting with c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57 not found: ID does not exist" containerID="c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57"
Mar 17 02:36:10 crc kubenswrapper[4735]: I0317 02:36:10.301950 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57"} err="failed to get container status \"c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57\": rpc error: code = NotFound desc = could not find container \"c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57\": container with ID starting with c29d32b7b348954b5b7a04be1d2b3d3b126db13cab521f6c0e0e32728ad1be57 not found: ID does not exist"
Mar 17 02:36:11 crc kubenswrapper[4735]: I0317 02:36:11.088025 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" path="/var/lib/kubelet/pods/18d652b4-cf32-4f7e-af9f-e3371c0381a7/volumes"
Mar 17 02:36:13 crc kubenswrapper[4735]: I0317 02:36:13.073337 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:36:13 crc kubenswrapper[4735]: E0317 02:36:13.074189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:36:24 crc kubenswrapper[4735]: I0317 02:36:24.073830 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:36:24 crc kubenswrapper[4735]: E0317 02:36:24.075003 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:36:35 crc kubenswrapper[4735]: I0317 02:36:35.078984 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:36:35 crc kubenswrapper[4735]: E0317 02:36:35.079630 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:36:49 crc kubenswrapper[4735]: I0317 02:36:49.073368 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:36:49 crc kubenswrapper[4735]: E0317 02:36:49.074191 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:37:03 crc kubenswrapper[4735]: I0317 02:37:03.110187 4735 scope.go:117] "RemoveContainer" containerID="2728457a42f3eaba04db4e785fc13a0c47bbbbca45164c90257ff0b36570a285"
Mar 17 02:37:04 crc kubenswrapper[4735]: I0317 02:37:04.073123 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:37:04 crc kubenswrapper[4735]: E0317 02:37:04.073967 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:37:17 crc kubenswrapper[4735]: I0317 02:37:17.073283 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:37:17 crc kubenswrapper[4735]: E0317 02:37:17.074109 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:37:29 crc kubenswrapper[4735]: I0317 02:37:29.073219 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:37:29 crc kubenswrapper[4735]: E0317 02:37:29.074056 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:37:41 crc kubenswrapper[4735]: I0317 02:37:41.073339 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:37:41 crc kubenswrapper[4735]: E0317 02:37:41.074008 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:37:52 crc kubenswrapper[4735]: I0317 02:37:52.073732 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:37:52 crc kubenswrapper[4735]: E0317 02:37:52.074881 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.162070 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561918-dpgdb"]
Mar 17 02:38:00 crc kubenswrapper[4735]: E0317 02:38:00.163005 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="extract-utilities"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.163021 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="extract-utilities"
Mar 17 02:38:00 crc kubenswrapper[4735]: E0317 02:38:00.163046 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a573ac-bdf0-41a5-847c-513987633028" containerName="oc"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.163054 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a573ac-bdf0-41a5-847c-513987633028" containerName="oc"
Mar 17 02:38:00 crc kubenswrapper[4735]: E0317 02:38:00.163088 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="extract-content"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.163096 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="extract-content"
Mar 17 02:38:00 crc kubenswrapper[4735]: E0317 02:38:00.163112 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.163120 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.163338 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d652b4-cf32-4f7e-af9f-e3371c0381a7" containerName="registry-server"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.163356 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a573ac-bdf0-41a5-847c-513987633028" containerName="oc"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.164105 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.167171 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.169294 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.169343 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.172980 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-dpgdb"]
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.307431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ggl9\" (UniqueName: \"kubernetes.io/projected/bd823da5-0bad-446f-8fd6-50ed466e7d0c-kube-api-access-4ggl9\") pod \"auto-csr-approver-29561918-dpgdb\" (UID: \"bd823da5-0bad-446f-8fd6-50ed466e7d0c\") " pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.409358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ggl9\" (UniqueName: \"kubernetes.io/projected/bd823da5-0bad-446f-8fd6-50ed466e7d0c-kube-api-access-4ggl9\") pod \"auto-csr-approver-29561918-dpgdb\" (UID: \"bd823da5-0bad-446f-8fd6-50ed466e7d0c\") " pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.426784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ggl9\" (UniqueName: \"kubernetes.io/projected/bd823da5-0bad-446f-8fd6-50ed466e7d0c-kube-api-access-4ggl9\") pod \"auto-csr-approver-29561918-dpgdb\" (UID: \"bd823da5-0bad-446f-8fd6-50ed466e7d0c\") " pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:00 crc kubenswrapper[4735]: I0317 02:38:00.514984 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:01 crc kubenswrapper[4735]: I0317 02:38:01.463583 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-dpgdb"]
Mar 17 02:38:02 crc kubenswrapper[4735]: I0317 02:38:02.360153 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-dpgdb" event={"ID":"bd823da5-0bad-446f-8fd6-50ed466e7d0c","Type":"ContainerStarted","Data":"4f57ea81e272b9aa90805b4fb0c3f96a102e8d8d010ee2eceb564a28918db598"}
Mar 17 02:38:03 crc kubenswrapper[4735]: I0317 02:38:03.373570 4735 generic.go:334] "Generic (PLEG): container finished" podID="bd823da5-0bad-446f-8fd6-50ed466e7d0c" containerID="b9ff558c5e47780516e41e5ade25f3ab698d1bd5eec6358a33cc36f1e2493ff6" exitCode=0
Mar 17 02:38:03 crc kubenswrapper[4735]: I0317 02:38:03.373769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-dpgdb" event={"ID":"bd823da5-0bad-446f-8fd6-50ed466e7d0c","Type":"ContainerDied","Data":"b9ff558c5e47780516e41e5ade25f3ab698d1bd5eec6358a33cc36f1e2493ff6"}
Mar 17 02:38:04 crc kubenswrapper[4735]: I0317 02:38:04.075191 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:38:04 crc kubenswrapper[4735]: E0317 02:38:04.075504 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:38:04 crc kubenswrapper[4735]: I0317 02:38:04.813911 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:04 crc kubenswrapper[4735]: I0317 02:38:04.906100 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ggl9\" (UniqueName: \"kubernetes.io/projected/bd823da5-0bad-446f-8fd6-50ed466e7d0c-kube-api-access-4ggl9\") pod \"bd823da5-0bad-446f-8fd6-50ed466e7d0c\" (UID: \"bd823da5-0bad-446f-8fd6-50ed466e7d0c\") "
Mar 17 02:38:04 crc kubenswrapper[4735]: I0317 02:38:04.923210 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd823da5-0bad-446f-8fd6-50ed466e7d0c-kube-api-access-4ggl9" (OuterVolumeSpecName: "kube-api-access-4ggl9") pod "bd823da5-0bad-446f-8fd6-50ed466e7d0c" (UID: "bd823da5-0bad-446f-8fd6-50ed466e7d0c"). InnerVolumeSpecName "kube-api-access-4ggl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 02:38:05 crc kubenswrapper[4735]: I0317 02:38:05.008629 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ggl9\" (UniqueName: \"kubernetes.io/projected/bd823da5-0bad-446f-8fd6-50ed466e7d0c-kube-api-access-4ggl9\") on node \"crc\" DevicePath \"\""
Mar 17 02:38:05 crc kubenswrapper[4735]: I0317 02:38:05.393457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-dpgdb" event={"ID":"bd823da5-0bad-446f-8fd6-50ed466e7d0c","Type":"ContainerDied","Data":"4f57ea81e272b9aa90805b4fb0c3f96a102e8d8d010ee2eceb564a28918db598"}
Mar 17 02:38:05 crc kubenswrapper[4735]: I0317 02:38:05.393496 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f57ea81e272b9aa90805b4fb0c3f96a102e8d8d010ee2eceb564a28918db598"
Mar 17 02:38:05 crc kubenswrapper[4735]: I0317 02:38:05.393504 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-dpgdb"
Mar 17 02:38:05 crc kubenswrapper[4735]: I0317 02:38:05.930179 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-6lzxg"]
Mar 17 02:38:05 crc kubenswrapper[4735]: I0317 02:38:05.946985 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-6lzxg"]
Mar 17 02:38:07 crc kubenswrapper[4735]: I0317 02:38:07.086625 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3aa9ac-fe44-4755-85f6-e430d301286d" path="/var/lib/kubelet/pods/0c3aa9ac-fe44-4755-85f6-e430d301286d/volumes"
Mar 17 02:38:16 crc kubenswrapper[4735]: I0317 02:38:16.073213 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:38:16 crc kubenswrapper[4735]: E0317 02:38:16.074775 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:38:29 crc kubenswrapper[4735]: I0317 02:38:29.073355 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:38:29 crc kubenswrapper[4735]: E0317 02:38:29.074164 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:38:42 crc kubenswrapper[4735]: I0317 02:38:42.073546 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:38:42 crc kubenswrapper[4735]: E0317 02:38:42.074643 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:38:54 crc kubenswrapper[4735]: I0317 02:38:54.073891 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:38:54 crc kubenswrapper[4735]: E0317 02:38:54.074910 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:39:03 crc kubenswrapper[4735]: I0317 02:39:03.276122 4735 scope.go:117] "RemoveContainer" containerID="c5e966f95b8b2609776da84a2ebf88be0666c6826e8918c1c565b4145c689d65"
Mar 17 02:39:06 crc kubenswrapper[4735]: I0317 02:39:06.073752 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:39:06 crc kubenswrapper[4735]: E0317 02:39:06.075438 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:39:17 crc kubenswrapper[4735]: I0317 02:39:17.074208 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:39:17 crc kubenswrapper[4735]: E0317 02:39:17.075000 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:39:28 crc kubenswrapper[4735]: I0317 02:39:28.073449 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:39:28 crc kubenswrapper[4735]: E0317 02:39:28.074521 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 02:39:42 crc kubenswrapper[4735]: I0317 02:39:42.074183 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41"
Mar 17 02:39:42 crc kubenswrapper[4735]: E0317 02:39:42.075249 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s
restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:39:57 crc kubenswrapper[4735]: I0317 02:39:57.074080 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:39:58 crc kubenswrapper[4735]: I0317 02:39:58.070136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"433e4891d705f2bccbcec8f675b9b31a8819ec338db972b2047d30c3d221172b"} Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.149388 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561920-qwnkp"] Mar 17 02:40:00 crc kubenswrapper[4735]: E0317 02:40:00.150165 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd823da5-0bad-446f-8fd6-50ed466e7d0c" containerName="oc" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.150176 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd823da5-0bad-446f-8fd6-50ed466e7d0c" containerName="oc" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.150384 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd823da5-0bad-446f-8fd6-50ed466e7d0c" containerName="oc" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.150955 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.156202 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.156499 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.157296 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.164807 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-qwnkp"] Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.240510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9x49\" (UniqueName: \"kubernetes.io/projected/231f526e-32a8-4e63-81d2-63b1a1670486-kube-api-access-p9x49\") pod \"auto-csr-approver-29561920-qwnkp\" (UID: \"231f526e-32a8-4e63-81d2-63b1a1670486\") " pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.342291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9x49\" (UniqueName: \"kubernetes.io/projected/231f526e-32a8-4e63-81d2-63b1a1670486-kube-api-access-p9x49\") pod \"auto-csr-approver-29561920-qwnkp\" (UID: \"231f526e-32a8-4e63-81d2-63b1a1670486\") " pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.366190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9x49\" (UniqueName: \"kubernetes.io/projected/231f526e-32a8-4e63-81d2-63b1a1670486-kube-api-access-p9x49\") pod \"auto-csr-approver-29561920-qwnkp\" (UID: \"231f526e-32a8-4e63-81d2-63b1a1670486\") " 
pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:00 crc kubenswrapper[4735]: I0317 02:40:00.477610 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:01 crc kubenswrapper[4735]: I0317 02:40:01.481173 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-qwnkp"] Mar 17 02:40:02 crc kubenswrapper[4735]: I0317 02:40:02.099706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" event={"ID":"231f526e-32a8-4e63-81d2-63b1a1670486","Type":"ContainerStarted","Data":"9ffea7036e5804372faf9b82ee85507907ae6b652769a9755d9c47f7d659a365"} Mar 17 02:40:03 crc kubenswrapper[4735]: I0317 02:40:03.108671 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" event={"ID":"231f526e-32a8-4e63-81d2-63b1a1670486","Type":"ContainerStarted","Data":"a7f47cb080c15eebec91aaffee47f16487a3d4added267cd71deecae11e437bd"} Mar 17 02:40:03 crc kubenswrapper[4735]: I0317 02:40:03.130770 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" podStartSLOduration=1.936407541 podStartE2EDuration="3.130750636s" podCreationTimestamp="2026-03-17 02:40:00 +0000 UTC" firstStartedPulling="2026-03-17 02:40:01.489232977 +0000 UTC m=+5427.121465945" lastFinishedPulling="2026-03-17 02:40:02.683576062 +0000 UTC m=+5428.315809040" observedRunningTime="2026-03-17 02:40:03.124091196 +0000 UTC m=+5428.756324174" watchObservedRunningTime="2026-03-17 02:40:03.130750636 +0000 UTC m=+5428.762983624" Mar 17 02:40:04 crc kubenswrapper[4735]: I0317 02:40:04.120277 4735 generic.go:334] "Generic (PLEG): container finished" podID="231f526e-32a8-4e63-81d2-63b1a1670486" containerID="a7f47cb080c15eebec91aaffee47f16487a3d4added267cd71deecae11e437bd" exitCode=0 Mar 17 02:40:04 crc 
kubenswrapper[4735]: I0317 02:40:04.120356 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" event={"ID":"231f526e-32a8-4e63-81d2-63b1a1670486","Type":"ContainerDied","Data":"a7f47cb080c15eebec91aaffee47f16487a3d4added267cd71deecae11e437bd"} Mar 17 02:40:05 crc kubenswrapper[4735]: I0317 02:40:05.599916 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:05 crc kubenswrapper[4735]: I0317 02:40:05.763912 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9x49\" (UniqueName: \"kubernetes.io/projected/231f526e-32a8-4e63-81d2-63b1a1670486-kube-api-access-p9x49\") pod \"231f526e-32a8-4e63-81d2-63b1a1670486\" (UID: \"231f526e-32a8-4e63-81d2-63b1a1670486\") " Mar 17 02:40:05 crc kubenswrapper[4735]: I0317 02:40:05.770509 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f526e-32a8-4e63-81d2-63b1a1670486-kube-api-access-p9x49" (OuterVolumeSpecName: "kube-api-access-p9x49") pod "231f526e-32a8-4e63-81d2-63b1a1670486" (UID: "231f526e-32a8-4e63-81d2-63b1a1670486"). InnerVolumeSpecName "kube-api-access-p9x49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:40:05 crc kubenswrapper[4735]: I0317 02:40:05.867956 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9x49\" (UniqueName: \"kubernetes.io/projected/231f526e-32a8-4e63-81d2-63b1a1670486-kube-api-access-p9x49\") on node \"crc\" DevicePath \"\"" Mar 17 02:40:06 crc kubenswrapper[4735]: I0317 02:40:06.141789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" event={"ID":"231f526e-32a8-4e63-81d2-63b1a1670486","Type":"ContainerDied","Data":"9ffea7036e5804372faf9b82ee85507907ae6b652769a9755d9c47f7d659a365"} Mar 17 02:40:06 crc kubenswrapper[4735]: I0317 02:40:06.141832 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ffea7036e5804372faf9b82ee85507907ae6b652769a9755d9c47f7d659a365" Mar 17 02:40:06 crc kubenswrapper[4735]: I0317 02:40:06.141869 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-qwnkp" Mar 17 02:40:06 crc kubenswrapper[4735]: I0317 02:40:06.218624 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-q26dv"] Mar 17 02:40:06 crc kubenswrapper[4735]: I0317 02:40:06.226585 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-q26dv"] Mar 17 02:40:07 crc kubenswrapper[4735]: I0317 02:40:07.084302 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba73d723-c937-438b-9ad8-3fc8baef218c" path="/var/lib/kubelet/pods/ba73d723-c937-438b-9ad8-3fc8baef218c/volumes" Mar 17 02:41:03 crc kubenswrapper[4735]: I0317 02:41:03.396516 4735 scope.go:117] "RemoveContainer" containerID="d0205944742ee5fa5f479ba56b720aaf040d5b61bfb09ad2c27dfe3d5ed87dd3" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.602178 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wqzkg"] Mar 17 02:41:15 crc kubenswrapper[4735]: E0317 02:41:15.603198 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f526e-32a8-4e63-81d2-63b1a1670486" containerName="oc" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.603214 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f526e-32a8-4e63-81d2-63b1a1670486" containerName="oc" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.603424 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f526e-32a8-4e63-81d2-63b1a1670486" containerName="oc" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.605123 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.618697 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqzkg"] Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.756135 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2xh\" (UniqueName: \"kubernetes.io/projected/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-kube-api-access-xz2xh\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.756248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-utilities\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.756398 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-catalog-content\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.858901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2xh\" (UniqueName: \"kubernetes.io/projected/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-kube-api-access-xz2xh\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.859255 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-utilities\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.859278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-catalog-content\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.859665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-utilities\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.859770 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-catalog-content\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.884145 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2xh\" (UniqueName: \"kubernetes.io/projected/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-kube-api-access-xz2xh\") pod \"redhat-marketplace-wqzkg\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:15 crc kubenswrapper[4735]: I0317 02:41:15.929964 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:16 crc kubenswrapper[4735]: I0317 02:41:16.521754 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqzkg"] Mar 17 02:41:16 crc kubenswrapper[4735]: I0317 02:41:16.980279 4735 generic.go:334] "Generic (PLEG): container finished" podID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerID="9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a" exitCode=0 Mar 17 02:41:16 crc kubenswrapper[4735]: I0317 02:41:16.980355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerDied","Data":"9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a"} Mar 17 02:41:16 crc kubenswrapper[4735]: I0317 02:41:16.980615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerStarted","Data":"f31fbadd2691be4e9e086db6939dea38c7e0b3226921559bf985435d4f1915f3"} Mar 17 02:41:16 crc kubenswrapper[4735]: I0317 02:41:16.982935 4735 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 17 02:41:17 crc kubenswrapper[4735]: I0317 02:41:17.989906 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerStarted","Data":"a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a"} Mar 17 02:41:18 crc kubenswrapper[4735]: I0317 02:41:18.998820 4735 generic.go:334] "Generic (PLEG): container finished" podID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerID="a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a" exitCode=0 Mar 17 02:41:18 crc kubenswrapper[4735]: I0317 02:41:18.999167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerDied","Data":"a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a"} Mar 17 02:41:20 crc kubenswrapper[4735]: I0317 02:41:20.020419 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerStarted","Data":"5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb"} Mar 17 02:41:20 crc kubenswrapper[4735]: I0317 02:41:20.047811 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wqzkg" podStartSLOduration=2.61637687 podStartE2EDuration="5.047789906s" podCreationTimestamp="2026-03-17 02:41:15 +0000 UTC" firstStartedPulling="2026-03-17 02:41:16.98236419 +0000 UTC m=+5502.614597158" lastFinishedPulling="2026-03-17 02:41:19.413777206 +0000 UTC m=+5505.046010194" observedRunningTime="2026-03-17 02:41:20.038000171 +0000 UTC m=+5505.670233149" watchObservedRunningTime="2026-03-17 02:41:20.047789906 +0000 UTC m=+5505.680022884" Mar 17 02:41:25 crc kubenswrapper[4735]: I0317 02:41:25.930050 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:25 crc kubenswrapper[4735]: I0317 02:41:25.930544 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:25 crc kubenswrapper[4735]: I0317 02:41:25.995999 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:26 crc kubenswrapper[4735]: I0317 02:41:26.140678 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:26 crc kubenswrapper[4735]: I0317 02:41:26.270091 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqzkg"] Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.094623 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wqzkg" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="registry-server" containerID="cri-o://5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb" gracePeriod=2 Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.573789 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.765689 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-catalog-content\") pod \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.765989 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2xh\" (UniqueName: \"kubernetes.io/projected/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-kube-api-access-xz2xh\") pod \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.766031 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-utilities\") pod \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\" (UID: \"f5f3d547-5cb6-4d1e-9608-5d4d166171cf\") " Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.766946 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-utilities" (OuterVolumeSpecName: "utilities") pod "f5f3d547-5cb6-4d1e-9608-5d4d166171cf" (UID: "f5f3d547-5cb6-4d1e-9608-5d4d166171cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.768615 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.773216 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-kube-api-access-xz2xh" (OuterVolumeSpecName: "kube-api-access-xz2xh") pod "f5f3d547-5cb6-4d1e-9608-5d4d166171cf" (UID: "f5f3d547-5cb6-4d1e-9608-5d4d166171cf"). InnerVolumeSpecName "kube-api-access-xz2xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.797756 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5f3d547-5cb6-4d1e-9608-5d4d166171cf" (UID: "f5f3d547-5cb6-4d1e-9608-5d4d166171cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.869448 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2xh\" (UniqueName: \"kubernetes.io/projected/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-kube-api-access-xz2xh\") on node \"crc\" DevicePath \"\"" Mar 17 02:41:28 crc kubenswrapper[4735]: I0317 02:41:28.869482 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f3d547-5cb6-4d1e-9608-5d4d166171cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.120147 4735 generic.go:334] "Generic (PLEG): container finished" podID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerID="5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb" exitCode=0 Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.120309 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqzkg" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.120329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerDied","Data":"5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb"} Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.121448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqzkg" event={"ID":"f5f3d547-5cb6-4d1e-9608-5d4d166171cf","Type":"ContainerDied","Data":"f31fbadd2691be4e9e086db6939dea38c7e0b3226921559bf985435d4f1915f3"} Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.121475 4735 scope.go:117] "RemoveContainer" containerID="5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.142474 4735 scope.go:117] "RemoveContainer" 
containerID="a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.163568 4735 scope.go:117] "RemoveContainer" containerID="9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.169713 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqzkg"] Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.177289 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqzkg"] Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.203268 4735 scope.go:117] "RemoveContainer" containerID="5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb" Mar 17 02:41:29 crc kubenswrapper[4735]: E0317 02:41:29.204954 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb\": container with ID starting with 5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb not found: ID does not exist" containerID="5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.204988 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb"} err="failed to get container status \"5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb\": rpc error: code = NotFound desc = could not find container \"5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb\": container with ID starting with 5d2316e70fab674990fd26b904a7243effe0a8b34252a7493e4947787e863bcb not found: ID does not exist" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.205011 4735 scope.go:117] "RemoveContainer" 
containerID="a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a" Mar 17 02:41:29 crc kubenswrapper[4735]: E0317 02:41:29.206383 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a\": container with ID starting with a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a not found: ID does not exist" containerID="a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.206454 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a"} err="failed to get container status \"a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a\": rpc error: code = NotFound desc = could not find container \"a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a\": container with ID starting with a0919b1c081ac8809cc293e19926122ce36528fc7225bb199bf9c935f725810a not found: ID does not exist" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.206493 4735 scope.go:117] "RemoveContainer" containerID="9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a" Mar 17 02:41:29 crc kubenswrapper[4735]: E0317 02:41:29.207074 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a\": container with ID starting with 9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a not found: ID does not exist" containerID="9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a" Mar 17 02:41:29 crc kubenswrapper[4735]: I0317 02:41:29.207109 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a"} err="failed to get container status \"9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a\": rpc error: code = NotFound desc = could not find container \"9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a\": container with ID starting with 9e9b27ae66f0d27298ce03cfc2b1eee0576d01fc9f7e546f92c7cd1c32d6d60a not found: ID does not exist" Mar 17 02:41:31 crc kubenswrapper[4735]: I0317 02:41:31.084341 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" path="/var/lib/kubelet/pods/f5f3d547-5cb6-4d1e-9608-5d4d166171cf/volumes" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.173803 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561922-b8gg4"] Mar 17 02:42:00 crc kubenswrapper[4735]: E0317 02:42:00.174806 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="extract-content" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.174821 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="extract-content" Mar 17 02:42:00 crc kubenswrapper[4735]: E0317 02:42:00.174833 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="extract-utilities" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.174839 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="extract-utilities" Mar 17 02:42:00 crc kubenswrapper[4735]: E0317 02:42:00.174874 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="registry-server" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.174880 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="registry-server" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.175096 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f3d547-5cb6-4d1e-9608-5d4d166171cf" containerName="registry-server" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.175886 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.180793 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.180907 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.181909 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.183279 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561922-b8gg4"] Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.255307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9w87\" (UniqueName: \"kubernetes.io/projected/dbaedfe4-e290-453e-9bf4-850197b1a1ca-kube-api-access-p9w87\") pod \"auto-csr-approver-29561922-b8gg4\" (UID: \"dbaedfe4-e290-453e-9bf4-850197b1a1ca\") " pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.357466 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9w87\" (UniqueName: \"kubernetes.io/projected/dbaedfe4-e290-453e-9bf4-850197b1a1ca-kube-api-access-p9w87\") pod \"auto-csr-approver-29561922-b8gg4\" (UID: \"dbaedfe4-e290-453e-9bf4-850197b1a1ca\") " 
pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.380972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9w87\" (UniqueName: \"kubernetes.io/projected/dbaedfe4-e290-453e-9bf4-850197b1a1ca-kube-api-access-p9w87\") pod \"auto-csr-approver-29561922-b8gg4\" (UID: \"dbaedfe4-e290-453e-9bf4-850197b1a1ca\") " pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.497743 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:00 crc kubenswrapper[4735]: I0317 02:42:00.947653 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561922-b8gg4"] Mar 17 02:42:01 crc kubenswrapper[4735]: I0317 02:42:01.482102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" event={"ID":"dbaedfe4-e290-453e-9bf4-850197b1a1ca","Type":"ContainerStarted","Data":"e43b1b106580f5946dac2ae742ccc1574300a3039a528905884e5ae3ca0ecd10"} Mar 17 02:42:02 crc kubenswrapper[4735]: I0317 02:42:02.490906 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" event={"ID":"dbaedfe4-e290-453e-9bf4-850197b1a1ca","Type":"ContainerStarted","Data":"9f8839a50270ceb44e20ca92fe5d2d241dcdb3af88ff98734c963efa54f5d5bf"} Mar 17 02:42:02 crc kubenswrapper[4735]: I0317 02:42:02.509596 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" podStartSLOduration=1.60290227 podStartE2EDuration="2.509568559s" podCreationTimestamp="2026-03-17 02:42:00 +0000 UTC" firstStartedPulling="2026-03-17 02:42:00.975739294 +0000 UTC m=+5546.607972282" lastFinishedPulling="2026-03-17 02:42:01.882405563 +0000 UTC m=+5547.514638571" observedRunningTime="2026-03-17 
02:42:02.502621413 +0000 UTC m=+5548.134854391" watchObservedRunningTime="2026-03-17 02:42:02.509568559 +0000 UTC m=+5548.141801557" Mar 17 02:42:03 crc kubenswrapper[4735]: I0317 02:42:03.504720 4735 generic.go:334] "Generic (PLEG): container finished" podID="dbaedfe4-e290-453e-9bf4-850197b1a1ca" containerID="9f8839a50270ceb44e20ca92fe5d2d241dcdb3af88ff98734c963efa54f5d5bf" exitCode=0 Mar 17 02:42:03 crc kubenswrapper[4735]: I0317 02:42:03.504901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" event={"ID":"dbaedfe4-e290-453e-9bf4-850197b1a1ca","Type":"ContainerDied","Data":"9f8839a50270ceb44e20ca92fe5d2d241dcdb3af88ff98734c963efa54f5d5bf"} Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.138567 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.266515 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9w87\" (UniqueName: \"kubernetes.io/projected/dbaedfe4-e290-453e-9bf4-850197b1a1ca-kube-api-access-p9w87\") pod \"dbaedfe4-e290-453e-9bf4-850197b1a1ca\" (UID: \"dbaedfe4-e290-453e-9bf4-850197b1a1ca\") " Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.279758 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbaedfe4-e290-453e-9bf4-850197b1a1ca-kube-api-access-p9w87" (OuterVolumeSpecName: "kube-api-access-p9w87") pod "dbaedfe4-e290-453e-9bf4-850197b1a1ca" (UID: "dbaedfe4-e290-453e-9bf4-850197b1a1ca"). InnerVolumeSpecName "kube-api-access-p9w87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.368349 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9w87\" (UniqueName: \"kubernetes.io/projected/dbaedfe4-e290-453e-9bf4-850197b1a1ca-kube-api-access-p9w87\") on node \"crc\" DevicePath \"\"" Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.526966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" event={"ID":"dbaedfe4-e290-453e-9bf4-850197b1a1ca","Type":"ContainerDied","Data":"e43b1b106580f5946dac2ae742ccc1574300a3039a528905884e5ae3ca0ecd10"} Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.526999 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e43b1b106580f5946dac2ae742ccc1574300a3039a528905884e5ae3ca0ecd10" Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.527051 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-b8gg4" Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.602583 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-frgcv"] Mar 17 02:42:05 crc kubenswrapper[4735]: I0317 02:42:05.613061 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-frgcv"] Mar 17 02:42:07 crc kubenswrapper[4735]: I0317 02:42:07.087763 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a573ac-bdf0-41a5-847c-513987633028" path="/var/lib/kubelet/pods/00a573ac-bdf0-41a5-847c-513987633028/volumes" Mar 17 02:42:12 crc kubenswrapper[4735]: I0317 02:42:12.606527 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 02:42:12 crc kubenswrapper[4735]: I0317 02:42:12.607186 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:42:42 crc kubenswrapper[4735]: I0317 02:42:42.606450 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:42:42 crc kubenswrapper[4735]: I0317 02:42:42.607076 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:42:47 crc kubenswrapper[4735]: I0317 02:42:47.930132 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cqwdr"] Mar 17 02:42:47 crc kubenswrapper[4735]: E0317 02:42:47.931303 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbaedfe4-e290-453e-9bf4-850197b1a1ca" containerName="oc" Mar 17 02:42:47 crc kubenswrapper[4735]: I0317 02:42:47.931325 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaedfe4-e290-453e-9bf4-850197b1a1ca" containerName="oc" Mar 17 02:42:47 crc kubenswrapper[4735]: I0317 02:42:47.931634 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbaedfe4-e290-453e-9bf4-850197b1a1ca" containerName="oc" Mar 17 02:42:47 crc kubenswrapper[4735]: I0317 02:42:47.933990 4735 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:47 crc kubenswrapper[4735]: I0317 02:42:47.944432 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqwdr"] Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.038945 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-utilities\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.039237 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-catalog-content\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.039338 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927q6\" (UniqueName: \"kubernetes.io/projected/f3a743da-e28d-4660-b9dc-85e4d47e3443-kube-api-access-927q6\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.141041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-utilities\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.141097 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-catalog-content\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.141221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-927q6\" (UniqueName: \"kubernetes.io/projected/f3a743da-e28d-4660-b9dc-85e4d47e3443-kube-api-access-927q6\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.141778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-catalog-content\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.141780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-utilities\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.160626 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-927q6\" (UniqueName: \"kubernetes.io/projected/f3a743da-e28d-4660-b9dc-85e4d47e3443-kube-api-access-927q6\") pod \"certified-operators-cqwdr\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.258075 4735 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.746937 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqwdr"] Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.980262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerStarted","Data":"218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a"} Mar 17 02:42:48 crc kubenswrapper[4735]: I0317 02:42:48.980529 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerStarted","Data":"d1f7c7ede2f177d060bda939ebf9030ac997eec4533273126490c6782dec7819"} Mar 17 02:42:49 crc kubenswrapper[4735]: I0317 02:42:49.992654 4735 generic.go:334] "Generic (PLEG): container finished" podID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerID="218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a" exitCode=0 Mar 17 02:42:49 crc kubenswrapper[4735]: I0317 02:42:49.993006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerDied","Data":"218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a"} Mar 17 02:42:49 crc kubenswrapper[4735]: I0317 02:42:49.993036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerStarted","Data":"6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d"} Mar 17 02:42:52 crc kubenswrapper[4735]: I0317 02:42:52.016457 4735 generic.go:334] "Generic (PLEG): container finished" podID="f3a743da-e28d-4660-b9dc-85e4d47e3443" 
containerID="6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d" exitCode=0 Mar 17 02:42:52 crc kubenswrapper[4735]: I0317 02:42:52.016672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerDied","Data":"6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d"} Mar 17 02:42:53 crc kubenswrapper[4735]: I0317 02:42:53.042068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerStarted","Data":"7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa"} Mar 17 02:42:53 crc kubenswrapper[4735]: I0317 02:42:53.066388 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cqwdr" podStartSLOduration=2.6036952060000003 podStartE2EDuration="6.066365531s" podCreationTimestamp="2026-03-17 02:42:47 +0000 UTC" firstStartedPulling="2026-03-17 02:42:48.981737955 +0000 UTC m=+5594.613970933" lastFinishedPulling="2026-03-17 02:42:52.44440828 +0000 UTC m=+5598.076641258" observedRunningTime="2026-03-17 02:42:53.061113606 +0000 UTC m=+5598.693346584" watchObservedRunningTime="2026-03-17 02:42:53.066365531 +0000 UTC m=+5598.698598519" Mar 17 02:42:58 crc kubenswrapper[4735]: I0317 02:42:58.258306 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:58 crc kubenswrapper[4735]: I0317 02:42:58.259146 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:42:59 crc kubenswrapper[4735]: I0317 02:42:59.312369 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cqwdr" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" 
containerName="registry-server" probeResult="failure" output=< Mar 17 02:42:59 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:42:59 crc kubenswrapper[4735]: > Mar 17 02:43:03 crc kubenswrapper[4735]: I0317 02:43:03.602447 4735 scope.go:117] "RemoveContainer" containerID="1ac628716a0faa23d402d3e05915b9835641e2c2765fe0cfda168639258d5cab" Mar 17 02:43:08 crc kubenswrapper[4735]: I0317 02:43:08.311962 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:43:08 crc kubenswrapper[4735]: I0317 02:43:08.371584 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:43:08 crc kubenswrapper[4735]: I0317 02:43:08.556916 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqwdr"] Mar 17 02:43:10 crc kubenswrapper[4735]: I0317 02:43:10.209918 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cqwdr" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="registry-server" containerID="cri-o://7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa" gracePeriod=2 Mar 17 02:43:10 crc kubenswrapper[4735]: I0317 02:43:10.939701 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.103214 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927q6\" (UniqueName: \"kubernetes.io/projected/f3a743da-e28d-4660-b9dc-85e4d47e3443-kube-api-access-927q6\") pod \"f3a743da-e28d-4660-b9dc-85e4d47e3443\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.103316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-catalog-content\") pod \"f3a743da-e28d-4660-b9dc-85e4d47e3443\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.103346 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-utilities\") pod \"f3a743da-e28d-4660-b9dc-85e4d47e3443\" (UID: \"f3a743da-e28d-4660-b9dc-85e4d47e3443\") " Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.104300 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-utilities" (OuterVolumeSpecName: "utilities") pod "f3a743da-e28d-4660-b9dc-85e4d47e3443" (UID: "f3a743da-e28d-4660-b9dc-85e4d47e3443"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.123714 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a743da-e28d-4660-b9dc-85e4d47e3443-kube-api-access-927q6" (OuterVolumeSpecName: "kube-api-access-927q6") pod "f3a743da-e28d-4660-b9dc-85e4d47e3443" (UID: "f3a743da-e28d-4660-b9dc-85e4d47e3443"). InnerVolumeSpecName "kube-api-access-927q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.152949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3a743da-e28d-4660-b9dc-85e4d47e3443" (UID: "f3a743da-e28d-4660-b9dc-85e4d47e3443"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.206044 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-927q6\" (UniqueName: \"kubernetes.io/projected/f3a743da-e28d-4660-b9dc-85e4d47e3443-kube-api-access-927q6\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.206121 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.206131 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a743da-e28d-4660-b9dc-85e4d47e3443-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.218978 4735 generic.go:334] "Generic (PLEG): container finished" podID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerID="7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa" exitCode=0 Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.219012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerDied","Data":"7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa"} Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.219039 4735 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqwdr" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.219065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqwdr" event={"ID":"f3a743da-e28d-4660-b9dc-85e4d47e3443","Type":"ContainerDied","Data":"d1f7c7ede2f177d060bda939ebf9030ac997eec4533273126490c6782dec7819"} Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.219087 4735 scope.go:117] "RemoveContainer" containerID="7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.242680 4735 scope.go:117] "RemoveContainer" containerID="6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.254214 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqwdr"] Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.263964 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cqwdr"] Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.280813 4735 scope.go:117] "RemoveContainer" containerID="218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.309559 4735 scope.go:117] "RemoveContainer" containerID="7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa" Mar 17 02:43:11 crc kubenswrapper[4735]: E0317 02:43:11.309851 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa\": container with ID starting with 7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa not found: ID does not exist" containerID="7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.309939 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa"} err="failed to get container status \"7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa\": rpc error: code = NotFound desc = could not find container \"7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa\": container with ID starting with 7c311f43e5a07b143f01471069f76417b5f7796b385de9fd0f9d3cf8c67a95aa not found: ID does not exist" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.309960 4735 scope.go:117] "RemoveContainer" containerID="6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d" Mar 17 02:43:11 crc kubenswrapper[4735]: E0317 02:43:11.310298 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d\": container with ID starting with 6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d not found: ID does not exist" containerID="6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.310361 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d"} err="failed to get container status \"6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d\": rpc error: code = NotFound desc = could not find container \"6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d\": container with ID starting with 6e8e6736de00df3f79e8b025b4a5925864d4f9e5e29eb5a5b45d88108149718d not found: ID does not exist" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.310389 4735 scope.go:117] "RemoveContainer" containerID="218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a" Mar 17 02:43:11 crc kubenswrapper[4735]: E0317 
02:43:11.310688 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a\": container with ID starting with 218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a not found: ID does not exist" containerID="218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a" Mar 17 02:43:11 crc kubenswrapper[4735]: I0317 02:43:11.310705 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a"} err="failed to get container status \"218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a\": rpc error: code = NotFound desc = could not find container \"218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a\": container with ID starting with 218933f18117887214cc3ec45f6c24390fb755dc15c0cda92017d802dc4e8d6a not found: ID does not exist" Mar 17 02:43:12 crc kubenswrapper[4735]: I0317 02:43:12.606448 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:43:12 crc kubenswrapper[4735]: I0317 02:43:12.606819 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:43:12 crc kubenswrapper[4735]: I0317 02:43:12.606908 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:43:12 crc 
kubenswrapper[4735]: I0317 02:43:12.607852 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"433e4891d705f2bccbcec8f675b9b31a8819ec338db972b2047d30c3d221172b"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:43:12 crc kubenswrapper[4735]: I0317 02:43:12.607979 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://433e4891d705f2bccbcec8f675b9b31a8819ec338db972b2047d30c3d221172b" gracePeriod=600 Mar 17 02:43:13 crc kubenswrapper[4735]: I0317 02:43:13.083557 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" path="/var/lib/kubelet/pods/f3a743da-e28d-4660-b9dc-85e4d47e3443/volumes" Mar 17 02:43:13 crc kubenswrapper[4735]: I0317 02:43:13.298363 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="433e4891d705f2bccbcec8f675b9b31a8819ec338db972b2047d30c3d221172b" exitCode=0 Mar 17 02:43:13 crc kubenswrapper[4735]: I0317 02:43:13.298420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"433e4891d705f2bccbcec8f675b9b31a8819ec338db972b2047d30c3d221172b"} Mar 17 02:43:13 crc kubenswrapper[4735]: I0317 02:43:13.298452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb"} Mar 17 
02:43:13 crc kubenswrapper[4735]: I0317 02:43:13.298472 4735 scope.go:117] "RemoveContainer" containerID="910edeea795e0e14114babe396c93b6c8085ee3b32382c130ae92763a2b66b41" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.885693 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lxvvv"] Mar 17 02:43:53 crc kubenswrapper[4735]: E0317 02:43:53.886600 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="registry-server" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.886615 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="registry-server" Mar 17 02:43:53 crc kubenswrapper[4735]: E0317 02:43:53.886642 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="extract-content" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.886652 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="extract-content" Mar 17 02:43:53 crc kubenswrapper[4735]: E0317 02:43:53.886670 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="extract-utilities" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.886678 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="extract-utilities" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.886933 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a743da-e28d-4660-b9dc-85e4d47e3443" containerName="registry-server" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.888510 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.908321 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxvvv"] Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.990117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-catalog-content\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.990593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-utilities\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:53 crc kubenswrapper[4735]: I0317 02:43:53.991307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb2xm\" (UniqueName: \"kubernetes.io/projected/2480f397-4fc1-4554-b99e-b552353ad396-kube-api-access-vb2xm\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.093385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb2xm\" (UniqueName: \"kubernetes.io/projected/2480f397-4fc1-4554-b99e-b552353ad396-kube-api-access-vb2xm\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.094180 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-catalog-content\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.094229 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-utilities\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.094794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-catalog-content\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.094838 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-utilities\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.320576 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb2xm\" (UniqueName: \"kubernetes.io/projected/2480f397-4fc1-4554-b99e-b552353ad396-kube-api-access-vb2xm\") pod \"community-operators-lxvvv\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:54 crc kubenswrapper[4735]: I0317 02:43:54.510035 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:43:55 crc kubenswrapper[4735]: I0317 02:43:55.038651 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxvvv"] Mar 17 02:43:55 crc kubenswrapper[4735]: I0317 02:43:55.696802 4735 generic.go:334] "Generic (PLEG): container finished" podID="2480f397-4fc1-4554-b99e-b552353ad396" containerID="cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d" exitCode=0 Mar 17 02:43:55 crc kubenswrapper[4735]: I0317 02:43:55.697181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxvvv" event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerDied","Data":"cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d"} Mar 17 02:43:55 crc kubenswrapper[4735]: I0317 02:43:55.697211 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxvvv" event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerStarted","Data":"cb2f5f693ae448aa9bd856f6e0d3b1a85a9f1496deeea3826c6db82eccc3bb2b"} Mar 17 02:43:56 crc kubenswrapper[4735]: I0317 02:43:56.705942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxvvv" event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerStarted","Data":"dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c"} Mar 17 02:43:58 crc kubenswrapper[4735]: I0317 02:43:58.723848 4735 generic.go:334] "Generic (PLEG): container finished" podID="2480f397-4fc1-4554-b99e-b552353ad396" containerID="dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c" exitCode=0 Mar 17 02:43:58 crc kubenswrapper[4735]: I0317 02:43:58.723918 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxvvv" 
event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerDied","Data":"dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c"} Mar 17 02:43:59 crc kubenswrapper[4735]: I0317 02:43:59.755426 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxvvv" event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerStarted","Data":"59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5"} Mar 17 02:43:59 crc kubenswrapper[4735]: I0317 02:43:59.784265 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lxvvv" podStartSLOduration=3.132995878 podStartE2EDuration="6.784246982s" podCreationTimestamp="2026-03-17 02:43:53 +0000 UTC" firstStartedPulling="2026-03-17 02:43:55.699110504 +0000 UTC m=+5661.331343482" lastFinishedPulling="2026-03-17 02:43:59.350361618 +0000 UTC m=+5664.982594586" observedRunningTime="2026-03-17 02:43:59.781042166 +0000 UTC m=+5665.413275154" watchObservedRunningTime="2026-03-17 02:43:59.784246982 +0000 UTC m=+5665.416479960" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.152977 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561924-z2wzc"] Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.154787 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.156518 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.164645 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561924-z2wzc"] Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.165893 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.166413 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.309903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwrbj\" (UniqueName: \"kubernetes.io/projected/287cfd38-e820-494c-9be5-a86fd369b3f8-kube-api-access-dwrbj\") pod \"auto-csr-approver-29561924-z2wzc\" (UID: \"287cfd38-e820-494c-9be5-a86fd369b3f8\") " pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.411871 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwrbj\" (UniqueName: \"kubernetes.io/projected/287cfd38-e820-494c-9be5-a86fd369b3f8-kube-api-access-dwrbj\") pod \"auto-csr-approver-29561924-z2wzc\" (UID: \"287cfd38-e820-494c-9be5-a86fd369b3f8\") " pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.432557 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwrbj\" (UniqueName: \"kubernetes.io/projected/287cfd38-e820-494c-9be5-a86fd369b3f8-kube-api-access-dwrbj\") pod \"auto-csr-approver-29561924-z2wzc\" (UID: \"287cfd38-e820-494c-9be5-a86fd369b3f8\") " 
pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.477876 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:00 crc kubenswrapper[4735]: I0317 02:44:00.939635 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561924-z2wzc"] Mar 17 02:44:01 crc kubenswrapper[4735]: I0317 02:44:01.776540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" event={"ID":"287cfd38-e820-494c-9be5-a86fd369b3f8","Type":"ContainerStarted","Data":"fbef3412b09e7b7de861ed722f6c128bfbe810c59f9d90f22e48acdf5bf48f02"} Mar 17 02:44:02 crc kubenswrapper[4735]: I0317 02:44:02.794843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" event={"ID":"287cfd38-e820-494c-9be5-a86fd369b3f8","Type":"ContainerStarted","Data":"e0b57652dbcd18847a9b5678198cedc5e373ccf6d142fd50433753216d60f5b3"} Mar 17 02:44:02 crc kubenswrapper[4735]: I0317 02:44:02.829404 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" podStartSLOduration=1.718356818 podStartE2EDuration="2.829382023s" podCreationTimestamp="2026-03-17 02:44:00 +0000 UTC" firstStartedPulling="2026-03-17 02:44:00.949956135 +0000 UTC m=+5666.582189113" lastFinishedPulling="2026-03-17 02:44:02.06098133 +0000 UTC m=+5667.693214318" observedRunningTime="2026-03-17 02:44:02.81545488 +0000 UTC m=+5668.447687858" watchObservedRunningTime="2026-03-17 02:44:02.829382023 +0000 UTC m=+5668.461615001" Mar 17 02:44:03 crc kubenswrapper[4735]: I0317 02:44:03.805589 4735 generic.go:334] "Generic (PLEG): container finished" podID="287cfd38-e820-494c-9be5-a86fd369b3f8" containerID="e0b57652dbcd18847a9b5678198cedc5e373ccf6d142fd50433753216d60f5b3" exitCode=0 Mar 17 02:44:03 crc 
kubenswrapper[4735]: I0317 02:44:03.805631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" event={"ID":"287cfd38-e820-494c-9be5-a86fd369b3f8","Type":"ContainerDied","Data":"e0b57652dbcd18847a9b5678198cedc5e373ccf6d142fd50433753216d60f5b3"} Mar 17 02:44:04 crc kubenswrapper[4735]: I0317 02:44:04.510905 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:44:04 crc kubenswrapper[4735]: I0317 02:44:04.510978 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.200239 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.313092 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwrbj\" (UniqueName: \"kubernetes.io/projected/287cfd38-e820-494c-9be5-a86fd369b3f8-kube-api-access-dwrbj\") pod \"287cfd38-e820-494c-9be5-a86fd369b3f8\" (UID: \"287cfd38-e820-494c-9be5-a86fd369b3f8\") " Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.321022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287cfd38-e820-494c-9be5-a86fd369b3f8-kube-api-access-dwrbj" (OuterVolumeSpecName: "kube-api-access-dwrbj") pod "287cfd38-e820-494c-9be5-a86fd369b3f8" (UID: "287cfd38-e820-494c-9be5-a86fd369b3f8"). InnerVolumeSpecName "kube-api-access-dwrbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.415694 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwrbj\" (UniqueName: \"kubernetes.io/projected/287cfd38-e820-494c-9be5-a86fd369b3f8-kube-api-access-dwrbj\") on node \"crc\" DevicePath \"\"" Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.571385 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lxvvv" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="registry-server" probeResult="failure" output=< Mar 17 02:44:05 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:44:05 crc kubenswrapper[4735]: > Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.828039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" event={"ID":"287cfd38-e820-494c-9be5-a86fd369b3f8","Type":"ContainerDied","Data":"fbef3412b09e7b7de861ed722f6c128bfbe810c59f9d90f22e48acdf5bf48f02"} Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.828074 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbef3412b09e7b7de861ed722f6c128bfbe810c59f9d90f22e48acdf5bf48f02" Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.828141 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-z2wzc" Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.891755 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-dpgdb"] Mar 17 02:44:05 crc kubenswrapper[4735]: I0317 02:44:05.901707 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-dpgdb"] Mar 17 02:44:07 crc kubenswrapper[4735]: I0317 02:44:07.082796 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd823da5-0bad-446f-8fd6-50ed466e7d0c" path="/var/lib/kubelet/pods/bd823da5-0bad-446f-8fd6-50ed466e7d0c/volumes" Mar 17 02:44:14 crc kubenswrapper[4735]: I0317 02:44:14.579184 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:44:14 crc kubenswrapper[4735]: I0317 02:44:14.651108 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:44:14 crc kubenswrapper[4735]: I0317 02:44:14.826093 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxvvv"] Mar 17 02:44:15 crc kubenswrapper[4735]: I0317 02:44:15.933476 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lxvvv" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="registry-server" containerID="cri-o://59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5" gracePeriod=2 Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.561557 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.759193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-utilities\") pod \"2480f397-4fc1-4554-b99e-b552353ad396\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.759359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-catalog-content\") pod \"2480f397-4fc1-4554-b99e-b552353ad396\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.759475 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb2xm\" (UniqueName: \"kubernetes.io/projected/2480f397-4fc1-4554-b99e-b552353ad396-kube-api-access-vb2xm\") pod \"2480f397-4fc1-4554-b99e-b552353ad396\" (UID: \"2480f397-4fc1-4554-b99e-b552353ad396\") " Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.760395 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-utilities" (OuterVolumeSpecName: "utilities") pod "2480f397-4fc1-4554-b99e-b552353ad396" (UID: "2480f397-4fc1-4554-b99e-b552353ad396"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.769226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2480f397-4fc1-4554-b99e-b552353ad396-kube-api-access-vb2xm" (OuterVolumeSpecName: "kube-api-access-vb2xm") pod "2480f397-4fc1-4554-b99e-b552353ad396" (UID: "2480f397-4fc1-4554-b99e-b552353ad396"). InnerVolumeSpecName "kube-api-access-vb2xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.819545 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2480f397-4fc1-4554-b99e-b552353ad396" (UID: "2480f397-4fc1-4554-b99e-b552353ad396"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.861549 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb2xm\" (UniqueName: \"kubernetes.io/projected/2480f397-4fc1-4554-b99e-b552353ad396-kube-api-access-vb2xm\") on node \"crc\" DevicePath \"\"" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.861586 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.861596 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2480f397-4fc1-4554-b99e-b552353ad396-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.941691 4735 generic.go:334] "Generic (PLEG): container finished" podID="2480f397-4fc1-4554-b99e-b552353ad396" containerID="59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5" exitCode=0 Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.941803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxvvv" event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerDied","Data":"59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5"} Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.941838 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-lxvvv" event={"ID":"2480f397-4fc1-4554-b99e-b552353ad396","Type":"ContainerDied","Data":"cb2f5f693ae448aa9bd856f6e0d3b1a85a9f1496deeea3826c6db82eccc3bb2b"} Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.941885 4735 scope.go:117] "RemoveContainer" containerID="59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.941784 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxvvv" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.980960 4735 scope.go:117] "RemoveContainer" containerID="dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c" Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.984453 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxvvv"] Mar 17 02:44:16 crc kubenswrapper[4735]: I0317 02:44:16.993729 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lxvvv"] Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.010564 4735 scope.go:117] "RemoveContainer" containerID="cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.049084 4735 scope.go:117] "RemoveContainer" containerID="59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5" Mar 17 02:44:17 crc kubenswrapper[4735]: E0317 02:44:17.049418 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5\": container with ID starting with 59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5 not found: ID does not exist" containerID="59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 
02:44:17.049449 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5"} err="failed to get container status \"59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5\": rpc error: code = NotFound desc = could not find container \"59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5\": container with ID starting with 59aa706002915377e5cca0ca3ae9137994d08620c941bedfc970bda8cc66cbe5 not found: ID does not exist" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.049469 4735 scope.go:117] "RemoveContainer" containerID="dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c" Mar 17 02:44:17 crc kubenswrapper[4735]: E0317 02:44:17.049673 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c\": container with ID starting with dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c not found: ID does not exist" containerID="dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.049696 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c"} err="failed to get container status \"dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c\": rpc error: code = NotFound desc = could not find container \"dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c\": container with ID starting with dce9b47077c36a74bb3948671a47659336097279a20fb84584be167a7195734c not found: ID does not exist" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.049711 4735 scope.go:117] "RemoveContainer" containerID="cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d" Mar 17 02:44:17 crc 
kubenswrapper[4735]: E0317 02:44:17.049948 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d\": container with ID starting with cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d not found: ID does not exist" containerID="cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.049971 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d"} err="failed to get container status \"cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d\": rpc error: code = NotFound desc = could not find container \"cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d\": container with ID starting with cd114c36d594f75522fc2505462c51d6709db57a50d47d0329c4b4613803e95d not found: ID does not exist" Mar 17 02:44:17 crc kubenswrapper[4735]: I0317 02:44:17.084316 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2480f397-4fc1-4554-b99e-b552353ad396" path="/var/lib/kubelet/pods/2480f397-4fc1-4554-b99e-b552353ad396/volumes" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.180171 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5"] Mar 17 02:45:00 crc kubenswrapper[4735]: E0317 02:45:00.182201 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="registry-server" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.182303 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="registry-server" Mar 17 02:45:00 crc kubenswrapper[4735]: E0317 02:45:00.182406 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="287cfd38-e820-494c-9be5-a86fd369b3f8" containerName="oc" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.182487 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="287cfd38-e820-494c-9be5-a86fd369b3f8" containerName="oc" Mar 17 02:45:00 crc kubenswrapper[4735]: E0317 02:45:00.182567 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="extract-content" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.182639 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="extract-content" Mar 17 02:45:00 crc kubenswrapper[4735]: E0317 02:45:00.182724 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="extract-utilities" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.182800 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="extract-utilities" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.183160 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2480f397-4fc1-4554-b99e-b552353ad396" containerName="registry-server" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.183265 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="287cfd38-e820-494c-9be5-a86fd369b3f8" containerName="oc" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.184090 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.186458 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.191281 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.196460 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5"] Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.326020 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctpd2\" (UniqueName: \"kubernetes.io/projected/65a45e01-9170-4471-aa81-896799b0bb80-kube-api-access-ctpd2\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.326824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65a45e01-9170-4471-aa81-896799b0bb80-config-volume\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.327029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65a45e01-9170-4471-aa81-896799b0bb80-secret-volume\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.429207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65a45e01-9170-4471-aa81-896799b0bb80-config-volume\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.429263 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65a45e01-9170-4471-aa81-896799b0bb80-secret-volume\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.429375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctpd2\" (UniqueName: \"kubernetes.io/projected/65a45e01-9170-4471-aa81-896799b0bb80-kube-api-access-ctpd2\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.431582 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65a45e01-9170-4471-aa81-896799b0bb80-config-volume\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.445452 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/65a45e01-9170-4471-aa81-896799b0bb80-secret-volume\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.449725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctpd2\" (UniqueName: \"kubernetes.io/projected/65a45e01-9170-4471-aa81-896799b0bb80-kube-api-access-ctpd2\") pod \"collect-profiles-29561925-69ws5\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:00 crc kubenswrapper[4735]: I0317 02:45:00.505930 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:01 crc kubenswrapper[4735]: I0317 02:45:01.008454 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5"] Mar 17 02:45:01 crc kubenswrapper[4735]: I0317 02:45:01.421510 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" event={"ID":"65a45e01-9170-4471-aa81-896799b0bb80","Type":"ContainerStarted","Data":"6b2417659dc00e4326c2c7b58569585e1fa96bec7ae9c7a26f7a5121b809086f"} Mar 17 02:45:01 crc kubenswrapper[4735]: I0317 02:45:01.422006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" event={"ID":"65a45e01-9170-4471-aa81-896799b0bb80","Type":"ContainerStarted","Data":"a43fdb6c10344477eee11710bd48831fc936bbe6a0236737316f0b532aa18109"} Mar 17 02:45:01 crc kubenswrapper[4735]: I0317 02:45:01.457200 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" 
podStartSLOduration=1.457173781 podStartE2EDuration="1.457173781s" podCreationTimestamp="2026-03-17 02:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:45:01.443616477 +0000 UTC m=+5727.075849465" watchObservedRunningTime="2026-03-17 02:45:01.457173781 +0000 UTC m=+5727.089406799" Mar 17 02:45:02 crc kubenswrapper[4735]: I0317 02:45:02.436227 4735 generic.go:334] "Generic (PLEG): container finished" podID="65a45e01-9170-4471-aa81-896799b0bb80" containerID="6b2417659dc00e4326c2c7b58569585e1fa96bec7ae9c7a26f7a5121b809086f" exitCode=0 Mar 17 02:45:02 crc kubenswrapper[4735]: I0317 02:45:02.436394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" event={"ID":"65a45e01-9170-4471-aa81-896799b0bb80","Type":"ContainerDied","Data":"6b2417659dc00e4326c2c7b58569585e1fa96bec7ae9c7a26f7a5121b809086f"} Mar 17 02:45:03 crc kubenswrapper[4735]: I0317 02:45:03.736543 4735 scope.go:117] "RemoveContainer" containerID="b9ff558c5e47780516e41e5ade25f3ab698d1bd5eec6358a33cc36f1e2493ff6" Mar 17 02:45:03 crc kubenswrapper[4735]: I0317 02:45:03.823101 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.005603 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctpd2\" (UniqueName: \"kubernetes.io/projected/65a45e01-9170-4471-aa81-896799b0bb80-kube-api-access-ctpd2\") pod \"65a45e01-9170-4471-aa81-896799b0bb80\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.005806 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65a45e01-9170-4471-aa81-896799b0bb80-config-volume\") pod \"65a45e01-9170-4471-aa81-896799b0bb80\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.005930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65a45e01-9170-4471-aa81-896799b0bb80-secret-volume\") pod \"65a45e01-9170-4471-aa81-896799b0bb80\" (UID: \"65a45e01-9170-4471-aa81-896799b0bb80\") " Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.006914 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a45e01-9170-4471-aa81-896799b0bb80-config-volume" (OuterVolumeSpecName: "config-volume") pod "65a45e01-9170-4471-aa81-896799b0bb80" (UID: "65a45e01-9170-4471-aa81-896799b0bb80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.012200 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a45e01-9170-4471-aa81-896799b0bb80-kube-api-access-ctpd2" (OuterVolumeSpecName: "kube-api-access-ctpd2") pod "65a45e01-9170-4471-aa81-896799b0bb80" (UID: "65a45e01-9170-4471-aa81-896799b0bb80"). 
InnerVolumeSpecName "kube-api-access-ctpd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.012522 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a45e01-9170-4471-aa81-896799b0bb80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "65a45e01-9170-4471-aa81-896799b0bb80" (UID: "65a45e01-9170-4471-aa81-896799b0bb80"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.108356 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctpd2\" (UniqueName: \"kubernetes.io/projected/65a45e01-9170-4471-aa81-896799b0bb80-kube-api-access-ctpd2\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.108749 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65a45e01-9170-4471-aa81-896799b0bb80-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.109241 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65a45e01-9170-4471-aa81-896799b0bb80-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.459250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" event={"ID":"65a45e01-9170-4471-aa81-896799b0bb80","Type":"ContainerDied","Data":"a43fdb6c10344477eee11710bd48831fc936bbe6a0236737316f0b532aa18109"} Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.459938 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a43fdb6c10344477eee11710bd48831fc936bbe6a0236737316f0b532aa18109" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.459365 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5" Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.549128 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl"] Mar 17 02:45:04 crc kubenswrapper[4735]: I0317 02:45:04.558036 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-mgvtl"] Mar 17 02:45:05 crc kubenswrapper[4735]: I0317 02:45:05.090101 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da359745-5f32-4749-8c55-e19847bae917" path="/var/lib/kubelet/pods/da359745-5f32-4749-8c55-e19847bae917/volumes" Mar 17 02:45:42 crc kubenswrapper[4735]: I0317 02:45:42.607157 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:45:42 crc kubenswrapper[4735]: I0317 02:45:42.608113 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:45:43 crc kubenswrapper[4735]: E0317 02:45:43.486225 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:47324->38.102.83.65:40841: write tcp 38.102.83.65:47324->38.102.83.65:40841: write: broken pipe Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.153557 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561926-tk7bj"] Mar 17 02:46:00 crc kubenswrapper[4735]: E0317 
02:46:00.156054 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a45e01-9170-4471-aa81-896799b0bb80" containerName="collect-profiles" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.156160 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a45e01-9170-4471-aa81-896799b0bb80" containerName="collect-profiles" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.156561 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a45e01-9170-4471-aa81-896799b0bb80" containerName="collect-profiles" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.157461 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.159751 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.160245 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.164524 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561926-tk7bj"] Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.165717 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.329224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbwqv\" (UniqueName: \"kubernetes.io/projected/0e409084-9c00-49ac-b69a-626f23589504-kube-api-access-wbwqv\") pod \"auto-csr-approver-29561926-tk7bj\" (UID: \"0e409084-9c00-49ac-b69a-626f23589504\") " pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.431192 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wbwqv\" (UniqueName: \"kubernetes.io/projected/0e409084-9c00-49ac-b69a-626f23589504-kube-api-access-wbwqv\") pod \"auto-csr-approver-29561926-tk7bj\" (UID: \"0e409084-9c00-49ac-b69a-626f23589504\") " pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.448199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbwqv\" (UniqueName: \"kubernetes.io/projected/0e409084-9c00-49ac-b69a-626f23589504-kube-api-access-wbwqv\") pod \"auto-csr-approver-29561926-tk7bj\" (UID: \"0e409084-9c00-49ac-b69a-626f23589504\") " pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:00 crc kubenswrapper[4735]: I0317 02:46:00.481395 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:01 crc kubenswrapper[4735]: I0317 02:46:01.105807 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561926-tk7bj"] Mar 17 02:46:02 crc kubenswrapper[4735]: I0317 02:46:02.062577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" event={"ID":"0e409084-9c00-49ac-b69a-626f23589504","Type":"ContainerStarted","Data":"9dce4f236e7e48d344cb4cbbff221a87fc7429bdf28a599bab31ba5f47266d1b"} Mar 17 02:46:03 crc kubenswrapper[4735]: I0317 02:46:03.104196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" event={"ID":"0e409084-9c00-49ac-b69a-626f23589504","Type":"ContainerStarted","Data":"513f9b740e9fb879ec5819d3edf6b1af46c48c158fa296bab303720c8e9ae028"} Mar 17 02:46:03 crc kubenswrapper[4735]: I0317 02:46:03.112105 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" podStartSLOduration=2.279900756 
podStartE2EDuration="3.112084153s" podCreationTimestamp="2026-03-17 02:46:00 +0000 UTC" firstStartedPulling="2026-03-17 02:46:01.111094799 +0000 UTC m=+5786.743327777" lastFinishedPulling="2026-03-17 02:46:01.943278206 +0000 UTC m=+5787.575511174" observedRunningTime="2026-03-17 02:46:03.099312278 +0000 UTC m=+5788.731545266" watchObservedRunningTime="2026-03-17 02:46:03.112084153 +0000 UTC m=+5788.744317141" Mar 17 02:46:03 crc kubenswrapper[4735]: I0317 02:46:03.883705 4735 scope.go:117] "RemoveContainer" containerID="e8f88eb69d915cb8494a9dd8483ff75a20193bddf531ad3e1e82a63c3a06cd9d" Mar 17 02:46:04 crc kubenswrapper[4735]: I0317 02:46:04.090558 4735 generic.go:334] "Generic (PLEG): container finished" podID="0e409084-9c00-49ac-b69a-626f23589504" containerID="513f9b740e9fb879ec5819d3edf6b1af46c48c158fa296bab303720c8e9ae028" exitCode=0 Mar 17 02:46:04 crc kubenswrapper[4735]: I0317 02:46:04.091140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" event={"ID":"0e409084-9c00-49ac-b69a-626f23589504","Type":"ContainerDied","Data":"513f9b740e9fb879ec5819d3edf6b1af46c48c158fa296bab303720c8e9ae028"} Mar 17 02:46:05 crc kubenswrapper[4735]: I0317 02:46:05.667958 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:05 crc kubenswrapper[4735]: I0317 02:46:05.841989 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbwqv\" (UniqueName: \"kubernetes.io/projected/0e409084-9c00-49ac-b69a-626f23589504-kube-api-access-wbwqv\") pod \"0e409084-9c00-49ac-b69a-626f23589504\" (UID: \"0e409084-9c00-49ac-b69a-626f23589504\") " Mar 17 02:46:05 crc kubenswrapper[4735]: I0317 02:46:05.848158 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e409084-9c00-49ac-b69a-626f23589504-kube-api-access-wbwqv" (OuterVolumeSpecName: "kube-api-access-wbwqv") pod "0e409084-9c00-49ac-b69a-626f23589504" (UID: "0e409084-9c00-49ac-b69a-626f23589504"). InnerVolumeSpecName "kube-api-access-wbwqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:46:05 crc kubenswrapper[4735]: I0317 02:46:05.944912 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbwqv\" (UniqueName: \"kubernetes.io/projected/0e409084-9c00-49ac-b69a-626f23589504-kube-api-access-wbwqv\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:06 crc kubenswrapper[4735]: I0317 02:46:06.128863 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" event={"ID":"0e409084-9c00-49ac-b69a-626f23589504","Type":"ContainerDied","Data":"9dce4f236e7e48d344cb4cbbff221a87fc7429bdf28a599bab31ba5f47266d1b"} Mar 17 02:46:06 crc kubenswrapper[4735]: I0317 02:46:06.129714 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dce4f236e7e48d344cb4cbbff221a87fc7429bdf28a599bab31ba5f47266d1b" Mar 17 02:46:06 crc kubenswrapper[4735]: I0317 02:46:06.129834 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-tk7bj" Mar 17 02:46:06 crc kubenswrapper[4735]: I0317 02:46:06.179529 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-qwnkp"] Mar 17 02:46:06 crc kubenswrapper[4735]: I0317 02:46:06.186556 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-qwnkp"] Mar 17 02:46:07 crc kubenswrapper[4735]: I0317 02:46:07.084017 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231f526e-32a8-4e63-81d2-63b1a1670486" path="/var/lib/kubelet/pods/231f526e-32a8-4e63-81d2-63b1a1670486/volumes" Mar 17 02:46:12 crc kubenswrapper[4735]: I0317 02:46:12.606696 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:46:12 crc kubenswrapper[4735]: I0317 02:46:12.607429 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.519194 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzh4j"] Mar 17 02:46:13 crc kubenswrapper[4735]: E0317 02:46:13.520004 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e409084-9c00-49ac-b69a-626f23589504" containerName="oc" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.520025 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e409084-9c00-49ac-b69a-626f23589504" containerName="oc" Mar 17 02:46:13 crc 
kubenswrapper[4735]: I0317 02:46:13.520271 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e409084-9c00-49ac-b69a-626f23589504" containerName="oc" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.521959 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.537657 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzh4j"] Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.651057 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rppkv\" (UniqueName: \"kubernetes.io/projected/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-kube-api-access-rppkv\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.651114 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-catalog-content\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.651251 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-utilities\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.752985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-utilities\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.753117 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rppkv\" (UniqueName: \"kubernetes.io/projected/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-kube-api-access-rppkv\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.753158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-catalog-content\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.753528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-utilities\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.753588 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-catalog-content\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.772638 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rppkv\" (UniqueName: 
\"kubernetes.io/projected/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-kube-api-access-rppkv\") pod \"redhat-operators-rzh4j\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:13 crc kubenswrapper[4735]: I0317 02:46:13.843971 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:14 crc kubenswrapper[4735]: I0317 02:46:14.306888 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzh4j"] Mar 17 02:46:15 crc kubenswrapper[4735]: I0317 02:46:15.224573 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerID="f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977" exitCode=0 Mar 17 02:46:15 crc kubenswrapper[4735]: I0317 02:46:15.225090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerDied","Data":"f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977"} Mar 17 02:46:15 crc kubenswrapper[4735]: I0317 02:46:15.225123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerStarted","Data":"80d3bbaf73d62675ddafa76fcf33a551ebeef0a7607425e103b5de42b495066f"} Mar 17 02:46:16 crc kubenswrapper[4735]: I0317 02:46:16.233761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerStarted","Data":"bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594"} Mar 17 02:46:20 crc kubenswrapper[4735]: I0317 02:46:20.281919 4735 generic.go:334] "Generic (PLEG): container finished" podID="07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" 
containerID="6154a9fb8d8bf213e9f3d93edb81056fcbcea600a0ca305807a507cd16047088" exitCode=0 Mar 17 02:46:20 crc kubenswrapper[4735]: I0317 02:46:20.282009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d","Type":"ContainerDied","Data":"6154a9fb8d8bf213e9f3d93edb81056fcbcea600a0ca305807a507cd16047088"} Mar 17 02:46:21 crc kubenswrapper[4735]: I0317 02:46:21.296114 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerID="bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594" exitCode=0 Mar 17 02:46:21 crc kubenswrapper[4735]: I0317 02:46:21.296247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerDied","Data":"bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594"} Mar 17 02:46:21 crc kubenswrapper[4735]: I0317 02:46:21.299309 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.131019 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.192951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ssh-key\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config-secret\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-workdir\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193296 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq9td\" (UniqueName: \"kubernetes.io/projected/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-kube-api-access-kq9td\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193451 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ca-certs\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193493 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-temporary\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193533 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.193561 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-config-data\") pod \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\" (UID: \"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d\") " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.198431 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.199736 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-config-data" (OuterVolumeSpecName: "config-data") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.204285 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-kube-api-access-kq9td" (OuterVolumeSpecName: "kube-api-access-kq9td") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "kube-api-access-kq9td". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.207162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.241668 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.273579 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.280122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.296117 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.296249 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.296439 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.296567 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.297500 
4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq9td\" (UniqueName: \"kubernetes.io/projected/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-kube-api-access-kq9td\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.297597 4735 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.297677 4735 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.308282 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.316369 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" (UID: "07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.321176 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Mar 17 02:46:22 crc kubenswrapper[4735]: E0317 02:46:22.321693 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" containerName="tempest-tests-tempest-tests-runner" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.321726 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" containerName="tempest-tests-tempest-tests-runner" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.322138 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d" containerName="tempest-tests-tempest-tests-runner" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.322775 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerStarted","Data":"9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33"} Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.326214 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.329235 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.329477 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.332112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d","Type":"ContainerDied","Data":"afe0e91de4e645b00e7ef63aa7a2ca38af275e9da46c94a7ab795a3f429aef73"} Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.332147 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe0e91de4e645b00e7ef63aa7a2ca38af275e9da46c94a7ab795a3f429aef73" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.332196 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.336177 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.375485 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.389110 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzh4j" podStartSLOduration=2.843781635 podStartE2EDuration="9.389089998s" podCreationTimestamp="2026-03-17 02:46:13 +0000 UTC" firstStartedPulling="2026-03-17 02:46:15.231277539 +0000 UTC m=+5800.863510517" lastFinishedPulling="2026-03-17 02:46:21.776585892 +0000 UTC m=+5807.408818880" observedRunningTime="2026-03-17 02:46:22.347491513 +0000 UTC m=+5807.979724491" watchObservedRunningTime="2026-03-17 02:46:22.389089998 +0000 UTC m=+5808.021322976" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.399837 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzqn\" (UniqueName: \"kubernetes.io/projected/db53fa15-c77f-4396-aaae-d0110e90ddb6-kube-api-access-qfzqn\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.399912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " 
pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.399965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400247 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400427 4735 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.400438 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.401969 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.431981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501498 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc 
kubenswrapper[4735]: I0317 02:46:22.501622 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzqn\" (UniqueName: \"kubernetes.io/projected/db53fa15-c77f-4396-aaae-d0110e90ddb6-kube-api-access-qfzqn\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.501756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.502034 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.502363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.504052 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.504388 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.506268 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.509310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.509465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.517356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzqn\" (UniqueName: \"kubernetes.io/projected/db53fa15-c77f-4396-aaae-d0110e90ddb6-kube-api-access-qfzqn\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:22 crc kubenswrapper[4735]: I0317 02:46:22.642774 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 02:46:23 crc kubenswrapper[4735]: I0317 02:46:23.010340 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Mar 17 02:46:23 crc kubenswrapper[4735]: W0317 02:46:23.015642 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb53fa15_c77f_4396_aaae_d0110e90ddb6.slice/crio-759f9941d8fed608f99f9193fd5ded239d9488686d10bcad5df404df78c7d88c WatchSource:0}: Error finding container 759f9941d8fed608f99f9193fd5ded239d9488686d10bcad5df404df78c7d88c: Status 404 returned error can't find the container with id 759f9941d8fed608f99f9193fd5ded239d9488686d10bcad5df404df78c7d88c Mar 17 02:46:23 crc kubenswrapper[4735]: I0317 02:46:23.344354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"db53fa15-c77f-4396-aaae-d0110e90ddb6","Type":"ContainerStarted","Data":"759f9941d8fed608f99f9193fd5ded239d9488686d10bcad5df404df78c7d88c"} Mar 17 02:46:23 crc kubenswrapper[4735]: I0317 02:46:23.845093 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:23 crc kubenswrapper[4735]: I0317 02:46:23.845156 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:46:24 crc kubenswrapper[4735]: I0317 02:46:24.903177 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzh4j" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:46:24 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:46:24 crc kubenswrapper[4735]: > Mar 17 02:46:26 crc kubenswrapper[4735]: I0317 02:46:26.375754 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"db53fa15-c77f-4396-aaae-d0110e90ddb6","Type":"ContainerStarted","Data":"8eba0d557b8ff2882e72cfa8bb849be4bed9cc10ec840fb98f4015897c1e5c8d"} Mar 17 02:46:26 crc kubenswrapper[4735]: I0317 02:46:26.412623 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" podStartSLOduration=4.41260372 podStartE2EDuration="4.41260372s" podCreationTimestamp="2026-03-17 02:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:46:26.40546633 +0000 UTC m=+5812.037699308" watchObservedRunningTime="2026-03-17 02:46:26.41260372 +0000 UTC m=+5812.044836708" Mar 17 02:46:34 crc kubenswrapper[4735]: I0317 02:46:34.938298 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzh4j" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:46:34 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:46:34 crc kubenswrapper[4735]: > Mar 17 02:46:42 crc kubenswrapper[4735]: I0317 02:46:42.606083 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:46:42 crc kubenswrapper[4735]: I0317 02:46:42.606543 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 17 02:46:42 crc kubenswrapper[4735]: I0317 02:46:42.606590 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:46:42 crc kubenswrapper[4735]: I0317 02:46:42.607460 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:46:42 crc kubenswrapper[4735]: I0317 02:46:42.607516 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" gracePeriod=600 Mar 17 02:46:42 crc kubenswrapper[4735]: E0317 02:46:42.745295 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:46:43 crc kubenswrapper[4735]: I0317 02:46:43.544274 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" exitCode=0 Mar 17 02:46:43 crc kubenswrapper[4735]: I0317 02:46:43.544316 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb"} Mar 17 02:46:43 crc kubenswrapper[4735]: I0317 02:46:43.544770 4735 scope.go:117] "RemoveContainer" containerID="433e4891d705f2bccbcec8f675b9b31a8819ec338db972b2047d30c3d221172b" Mar 17 02:46:43 crc kubenswrapper[4735]: I0317 02:46:43.545423 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:46:43 crc kubenswrapper[4735]: E0317 02:46:43.545718 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:46:44 crc kubenswrapper[4735]: I0317 02:46:44.957328 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzh4j" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:46:44 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:46:44 crc kubenswrapper[4735]: > Mar 17 02:46:54 crc kubenswrapper[4735]: I0317 02:46:54.914151 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzh4j" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" probeResult="failure" output=< Mar 17 02:46:54 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:46:54 crc kubenswrapper[4735]: > Mar 17 02:46:57 crc kubenswrapper[4735]: I0317 02:46:57.073124 4735 scope.go:117] "RemoveContainer" 
containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:46:57 crc kubenswrapper[4735]: E0317 02:46:57.074047 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:47:03 crc kubenswrapper[4735]: I0317 02:47:03.910035 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:47:03 crc kubenswrapper[4735]: I0317 02:47:03.969695 4735 scope.go:117] "RemoveContainer" containerID="a7f47cb080c15eebec91aaffee47f16487a3d4added267cd71deecae11e437bd" Mar 17 02:47:03 crc kubenswrapper[4735]: I0317 02:47:03.985446 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:47:04 crc kubenswrapper[4735]: I0317 02:47:04.172995 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzh4j"] Mar 17 02:47:05 crc kubenswrapper[4735]: I0317 02:47:05.790928 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzh4j" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" containerID="cri-o://9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33" gracePeriod=2 Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.465626 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.562941 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-utilities\") pod \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.562986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rppkv\" (UniqueName: \"kubernetes.io/projected/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-kube-api-access-rppkv\") pod \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.563102 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-catalog-content\") pod \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\" (UID: \"a7ca7c51-30a6-4579-9a87-e3e14f59c18c\") " Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.563642 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-utilities" (OuterVolumeSpecName: "utilities") pod "a7ca7c51-30a6-4579-9a87-e3e14f59c18c" (UID: "a7ca7c51-30a6-4579-9a87-e3e14f59c18c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.579018 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-kube-api-access-rppkv" (OuterVolumeSpecName: "kube-api-access-rppkv") pod "a7ca7c51-30a6-4579-9a87-e3e14f59c18c" (UID: "a7ca7c51-30a6-4579-9a87-e3e14f59c18c"). InnerVolumeSpecName "kube-api-access-rppkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.666308 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.666349 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rppkv\" (UniqueName: \"kubernetes.io/projected/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-kube-api-access-rppkv\") on node \"crc\" DevicePath \"\"" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.722675 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ca7c51-30a6-4579-9a87-e3e14f59c18c" (UID: "a7ca7c51-30a6-4579-9a87-e3e14f59c18c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.767675 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca7c51-30a6-4579-9a87-e3e14f59c18c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.799174 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerID="9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33" exitCode=0 Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.799222 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerDied","Data":"9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33"} Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.799248 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-rzh4j" event={"ID":"a7ca7c51-30a6-4579-9a87-e3e14f59c18c","Type":"ContainerDied","Data":"80d3bbaf73d62675ddafa76fcf33a551ebeef0a7607425e103b5de42b495066f"} Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.799268 4735 scope.go:117] "RemoveContainer" containerID="9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.800430 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzh4j" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.832422 4735 scope.go:117] "RemoveContainer" containerID="bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.850068 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzh4j"] Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.863165 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzh4j"] Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.870080 4735 scope.go:117] "RemoveContainer" containerID="f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.921353 4735 scope.go:117] "RemoveContainer" containerID="9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33" Mar 17 02:47:06 crc kubenswrapper[4735]: E0317 02:47:06.921867 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33\": container with ID starting with 9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33 not found: ID does not exist" containerID="9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.921898 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33"} err="failed to get container status \"9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33\": rpc error: code = NotFound desc = could not find container \"9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33\": container with ID starting with 9f3664c3f0052b1fcc32c9b398c035896e3a39c0e785d77bc589bb8b2c21ca33 not found: ID does not exist" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.921919 4735 scope.go:117] "RemoveContainer" containerID="bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594" Mar 17 02:47:06 crc kubenswrapper[4735]: E0317 02:47:06.922184 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594\": container with ID starting with bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594 not found: ID does not exist" containerID="bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.922216 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594"} err="failed to get container status \"bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594\": rpc error: code = NotFound desc = could not find container \"bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594\": container with ID starting with bff68bd111040214c00ce0c9edbef81602643391e6459f6fe19e4ee136d58594 not found: ID does not exist" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.922235 4735 scope.go:117] "RemoveContainer" containerID="f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977" Mar 17 02:47:06 crc kubenswrapper[4735]: E0317 
02:47:06.922559 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977\": container with ID starting with f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977 not found: ID does not exist" containerID="f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977" Mar 17 02:47:06 crc kubenswrapper[4735]: I0317 02:47:06.922610 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977"} err="failed to get container status \"f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977\": rpc error: code = NotFound desc = could not find container \"f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977\": container with ID starting with f5b11f5836ba29729c43e29885935e49a7aa2fed584f3f0d3d929b126cc2d977 not found: ID does not exist" Mar 17 02:47:07 crc kubenswrapper[4735]: I0317 02:47:07.092526 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" path="/var/lib/kubelet/pods/a7ca7c51-30a6-4579-9a87-e3e14f59c18c/volumes" Mar 17 02:47:11 crc kubenswrapper[4735]: I0317 02:47:11.076939 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:47:11 crc kubenswrapper[4735]: E0317 02:47:11.078017 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:47:24 crc kubenswrapper[4735]: I0317 02:47:24.074137 
4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:47:24 crc kubenswrapper[4735]: E0317 02:47:24.074929 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.435352 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d9c87887c-nzq2p"] Mar 17 02:47:32 crc kubenswrapper[4735]: E0317 02:47:32.436166 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="extract-utilities" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.436178 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="extract-utilities" Mar 17 02:47:32 crc kubenswrapper[4735]: E0317 02:47:32.436203 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.436209 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" Mar 17 02:47:32 crc kubenswrapper[4735]: E0317 02:47:32.436221 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="extract-content" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.436229 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="extract-content" Mar 17 02:47:32 crc 
kubenswrapper[4735]: I0317 02:47:32.436406 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ca7c51-30a6-4579-9a87-e3e14f59c18c" containerName="registry-server" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.438654 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.448517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-public-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.448588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-config\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.448646 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-combined-ca-bundle\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.448692 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 
02:47:32.448709 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t29ph\" (UniqueName: \"kubernetes.io/projected/85ecf5a4-2bd8-444d-9008-c2cec6d57362-kube-api-access-t29ph\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.448735 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-httpd-config\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.448763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-ovndb-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.451224 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d9c87887c-nzq2p"] Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-public-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-config\") pod \"neutron-d9c87887c-nzq2p\" (UID: 
\"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550597 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-combined-ca-bundle\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550662 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t29ph\" (UniqueName: \"kubernetes.io/projected/85ecf5a4-2bd8-444d-9008-c2cec6d57362-kube-api-access-t29ph\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550689 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-httpd-config\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.550717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-ovndb-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 
17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.559744 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.561678 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-ovndb-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.563091 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-config\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.566972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-httpd-config\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.567011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-public-tls-certs\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.568399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-combined-ca-bundle\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.574692 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t29ph\" (UniqueName: \"kubernetes.io/projected/85ecf5a4-2bd8-444d-9008-c2cec6d57362-kube-api-access-t29ph\") pod \"neutron-d9c87887c-nzq2p\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:32 crc kubenswrapper[4735]: I0317 02:47:32.756954 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:33 crc kubenswrapper[4735]: I0317 02:47:33.336845 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d9c87887c-nzq2p"] Mar 17 02:47:34 crc kubenswrapper[4735]: I0317 02:47:34.100178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9c87887c-nzq2p" event={"ID":"85ecf5a4-2bd8-444d-9008-c2cec6d57362","Type":"ContainerStarted","Data":"f22530e71aeaeea7659cae58e7701b91cb653ec4667b28a75942ea9e75216442"} Mar 17 02:47:34 crc kubenswrapper[4735]: I0317 02:47:34.100578 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:47:34 crc kubenswrapper[4735]: I0317 02:47:34.100598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9c87887c-nzq2p" event={"ID":"85ecf5a4-2bd8-444d-9008-c2cec6d57362","Type":"ContainerStarted","Data":"d8ad7c5e0c4a91a6804e23bf06a1cf430c578a70e39fa2d6b7363239a4740ea6"} Mar 17 02:47:34 crc kubenswrapper[4735]: I0317 02:47:34.100635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9c87887c-nzq2p" 
event={"ID":"85ecf5a4-2bd8-444d-9008-c2cec6d57362","Type":"ContainerStarted","Data":"e0be22e99164a5922fc5f78a685d2f4c4a0922a2d0e1622e7b2d1938b2c5ee80"} Mar 17 02:47:34 crc kubenswrapper[4735]: I0317 02:47:34.127083 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d9c87887c-nzq2p" podStartSLOduration=2.127057632 podStartE2EDuration="2.127057632s" podCreationTimestamp="2026-03-17 02:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:47:34.122567245 +0000 UTC m=+5879.754800233" watchObservedRunningTime="2026-03-17 02:47:34.127057632 +0000 UTC m=+5879.759290610" Mar 17 02:47:36 crc kubenswrapper[4735]: I0317 02:47:36.073982 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:47:36 crc kubenswrapper[4735]: E0317 02:47:36.075031 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:47:47 crc kubenswrapper[4735]: I0317 02:47:47.072889 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:47:47 crc kubenswrapper[4735]: E0317 02:47:47.073483 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:47:59 crc kubenswrapper[4735]: I0317 02:47:59.073787 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:47:59 crc kubenswrapper[4735]: E0317 02:47:59.075065 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.177827 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561928-9smrv"] Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.181389 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.192368 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.192767 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.193073 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.201416 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561928-9smrv"] Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.243174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbd7\" (UniqueName: \"kubernetes.io/projected/1c5561d2-f21e-4b59-9421-b23888d9f704-kube-api-access-fmbd7\") pod \"auto-csr-approver-29561928-9smrv\" (UID: \"1c5561d2-f21e-4b59-9421-b23888d9f704\") " pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.345146 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbd7\" (UniqueName: \"kubernetes.io/projected/1c5561d2-f21e-4b59-9421-b23888d9f704-kube-api-access-fmbd7\") pod \"auto-csr-approver-29561928-9smrv\" (UID: \"1c5561d2-f21e-4b59-9421-b23888d9f704\") " pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.363929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbd7\" (UniqueName: \"kubernetes.io/projected/1c5561d2-f21e-4b59-9421-b23888d9f704-kube-api-access-fmbd7\") pod \"auto-csr-approver-29561928-9smrv\" (UID: \"1c5561d2-f21e-4b59-9421-b23888d9f704\") " 
pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:00 crc kubenswrapper[4735]: I0317 02:48:00.502366 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:01 crc kubenswrapper[4735]: I0317 02:48:01.319844 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561928-9smrv"] Mar 17 02:48:01 crc kubenswrapper[4735]: I0317 02:48:01.416148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561928-9smrv" event={"ID":"1c5561d2-f21e-4b59-9421-b23888d9f704","Type":"ContainerStarted","Data":"b4f587ee713d8327ef223970086da249ce5320d99dcddfeb32da9367925d3d84"} Mar 17 02:48:02 crc kubenswrapper[4735]: I0317 02:48:02.778777 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 02:48:02 crc kubenswrapper[4735]: I0317 02:48:02.858334 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bc47cb9c7-j5cjp"] Mar 17 02:48:02 crc kubenswrapper[4735]: I0317 02:48:02.859756 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bc47cb9c7-j5cjp" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-api" containerID="cri-o://918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74" gracePeriod=30 Mar 17 02:48:02 crc kubenswrapper[4735]: I0317 02:48:02.860153 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bc47cb9c7-j5cjp" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-httpd" containerID="cri-o://3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4" gracePeriod=30 Mar 17 02:48:03 crc kubenswrapper[4735]: I0317 02:48:03.441515 4735 generic.go:334] "Generic (PLEG): container finished" podID="03b7ea53-cedc-42d2-a218-183b2be7646c" 
containerID="3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4" exitCode=0 Mar 17 02:48:03 crc kubenswrapper[4735]: I0317 02:48:03.441595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc47cb9c7-j5cjp" event={"ID":"03b7ea53-cedc-42d2-a218-183b2be7646c","Type":"ContainerDied","Data":"3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4"} Mar 17 02:48:03 crc kubenswrapper[4735]: I0317 02:48:03.443761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561928-9smrv" event={"ID":"1c5561d2-f21e-4b59-9421-b23888d9f704","Type":"ContainerStarted","Data":"3309074f7babf6c8f752af96cbabb6977fc6db30089081c8657ac052e880ca08"} Mar 17 02:48:03 crc kubenswrapper[4735]: I0317 02:48:03.466182 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561928-9smrv" podStartSLOduration=2.447263632 podStartE2EDuration="3.466161464s" podCreationTimestamp="2026-03-17 02:48:00 +0000 UTC" firstStartedPulling="2026-03-17 02:48:01.31758714 +0000 UTC m=+5906.949820128" lastFinishedPulling="2026-03-17 02:48:02.336484982 +0000 UTC m=+5907.968717960" observedRunningTime="2026-03-17 02:48:03.459689259 +0000 UTC m=+5909.091922247" watchObservedRunningTime="2026-03-17 02:48:03.466161464 +0000 UTC m=+5909.098394452" Mar 17 02:48:04 crc kubenswrapper[4735]: I0317 02:48:04.454046 4735 generic.go:334] "Generic (PLEG): container finished" podID="1c5561d2-f21e-4b59-9421-b23888d9f704" containerID="3309074f7babf6c8f752af96cbabb6977fc6db30089081c8657ac052e880ca08" exitCode=0 Mar 17 02:48:04 crc kubenswrapper[4735]: I0317 02:48:04.454268 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561928-9smrv" event={"ID":"1c5561d2-f21e-4b59-9421-b23888d9f704","Type":"ContainerDied","Data":"3309074f7babf6c8f752af96cbabb6977fc6db30089081c8657ac052e880ca08"} Mar 17 02:48:05 crc kubenswrapper[4735]: I0317 02:48:05.849701 
4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:05 crc kubenswrapper[4735]: I0317 02:48:05.859379 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbd7\" (UniqueName: \"kubernetes.io/projected/1c5561d2-f21e-4b59-9421-b23888d9f704-kube-api-access-fmbd7\") pod \"1c5561d2-f21e-4b59-9421-b23888d9f704\" (UID: \"1c5561d2-f21e-4b59-9421-b23888d9f704\") " Mar 17 02:48:05 crc kubenswrapper[4735]: I0317 02:48:05.866765 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5561d2-f21e-4b59-9421-b23888d9f704-kube-api-access-fmbd7" (OuterVolumeSpecName: "kube-api-access-fmbd7") pod "1c5561d2-f21e-4b59-9421-b23888d9f704" (UID: "1c5561d2-f21e-4b59-9421-b23888d9f704"). InnerVolumeSpecName "kube-api-access-fmbd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:48:05 crc kubenswrapper[4735]: I0317 02:48:05.961804 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmbd7\" (UniqueName: \"kubernetes.io/projected/1c5561d2-f21e-4b59-9421-b23888d9f704-kube-api-access-fmbd7\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:06 crc kubenswrapper[4735]: I0317 02:48:06.475015 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561928-9smrv" event={"ID":"1c5561d2-f21e-4b59-9421-b23888d9f704","Type":"ContainerDied","Data":"b4f587ee713d8327ef223970086da249ce5320d99dcddfeb32da9367925d3d84"} Mar 17 02:48:06 crc kubenswrapper[4735]: I0317 02:48:06.475063 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4f587ee713d8327ef223970086da249ce5320d99dcddfeb32da9367925d3d84" Mar 17 02:48:06 crc kubenswrapper[4735]: I0317 02:48:06.475109 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561928-9smrv" Mar 17 02:48:06 crc kubenswrapper[4735]: I0317 02:48:06.552510 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561922-b8gg4"] Mar 17 02:48:06 crc kubenswrapper[4735]: I0317 02:48:06.569974 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561922-b8gg4"] Mar 17 02:48:07 crc kubenswrapper[4735]: I0317 02:48:07.082873 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbaedfe4-e290-453e-9bf4-850197b1a1ca" path="/var/lib/kubelet/pods/dbaedfe4-e290-453e-9bf4-850197b1a1ca/volumes" Mar 17 02:48:09 crc kubenswrapper[4735]: I0317 02:48:09.378778 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7bc47cb9c7-j5cjp" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9696/\": dial tcp 10.217.0.175:9696: connect: connection refused" Mar 17 02:48:10 crc kubenswrapper[4735]: I0317 02:48:10.073611 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:48:10 crc kubenswrapper[4735]: E0317 02:48:10.073891 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.074357 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:48:23 crc kubenswrapper[4735]: E0317 02:48:23.075119 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.190410 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.316026 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-public-tls-certs\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.316344 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-config\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.316477 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnrpw\" (UniqueName: \"kubernetes.io/projected/03b7ea53-cedc-42d2-a218-183b2be7646c-kube-api-access-qnrpw\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.316626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-ovndb-tls-certs\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 
crc kubenswrapper[4735]: I0317 02:48:23.316736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-internal-tls-certs\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.316829 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-httpd-config\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.316949 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-combined-ca-bundle\") pod \"03b7ea53-cedc-42d2-a218-183b2be7646c\" (UID: \"03b7ea53-cedc-42d2-a218-183b2be7646c\") " Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.325743 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.334645 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b7ea53-cedc-42d2-a218-183b2be7646c-kube-api-access-qnrpw" (OuterVolumeSpecName: "kube-api-access-qnrpw") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "kube-api-access-qnrpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.372416 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.376846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.377606 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-config" (OuterVolumeSpecName: "config") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.382276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.419959 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "03b7ea53-cedc-42d2-a218-183b2be7646c" (UID: "03b7ea53-cedc-42d2-a218-183b2be7646c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.420762 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.420951 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-config\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.421078 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnrpw\" (UniqueName: \"kubernetes.io/projected/03b7ea53-cedc-42d2-a218-183b2be7646c-kube-api-access-qnrpw\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.421135 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.421189 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.421245 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.523634 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b7ea53-cedc-42d2-a218-183b2be7646c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.628284 4735 generic.go:334] "Generic (PLEG): container finished" podID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerID="918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74" exitCode=0 Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.628329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc47cb9c7-j5cjp" event={"ID":"03b7ea53-cedc-42d2-a218-183b2be7646c","Type":"ContainerDied","Data":"918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74"} Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.628359 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc47cb9c7-j5cjp" event={"ID":"03b7ea53-cedc-42d2-a218-183b2be7646c","Type":"ContainerDied","Data":"55c748cf9e62e7fe583b5f75489fd4df8724bae2c940cc295d3745363e953bfb"} Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.628379 4735 scope.go:117] "RemoveContainer" containerID="3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.628500 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bc47cb9c7-j5cjp" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.670129 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bc47cb9c7-j5cjp"] Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.671194 4735 scope.go:117] "RemoveContainer" containerID="918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.677888 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bc47cb9c7-j5cjp"] Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.715095 4735 scope.go:117] "RemoveContainer" containerID="3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4" Mar 17 02:48:23 crc kubenswrapper[4735]: E0317 02:48:23.715628 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4\": container with ID starting with 3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4 not found: ID does not exist" containerID="3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.715679 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4"} err="failed to get container status \"3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4\": rpc error: code = NotFound desc = could not find container \"3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4\": container with ID starting with 3d0d2eb18f143db592fad2b7f83a0b872dc30c266591db42517d6cd1d32a54f4 not found: ID does not exist" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.715714 4735 scope.go:117] "RemoveContainer" containerID="918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74" Mar 17 02:48:23 
crc kubenswrapper[4735]: E0317 02:48:23.716661 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74\": container with ID starting with 918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74 not found: ID does not exist" containerID="918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74" Mar 17 02:48:23 crc kubenswrapper[4735]: I0317 02:48:23.716718 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74"} err="failed to get container status \"918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74\": rpc error: code = NotFound desc = could not find container \"918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74\": container with ID starting with 918ce2ccb8877c2c5cd3a57bf6079f4ce347e4d8a08db7454ad8d9af91683b74 not found: ID does not exist" Mar 17 02:48:25 crc kubenswrapper[4735]: I0317 02:48:25.091665 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" path="/var/lib/kubelet/pods/03b7ea53-cedc-42d2-a218-183b2be7646c/volumes" Mar 17 02:48:38 crc kubenswrapper[4735]: I0317 02:48:38.073619 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:48:38 crc kubenswrapper[4735]: E0317 02:48:38.074278 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:48:53 crc 
kubenswrapper[4735]: I0317 02:48:53.073301 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:48:53 crc kubenswrapper[4735]: E0317 02:48:53.074415 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:48:56 crc kubenswrapper[4735]: E0317 02:48:56.630589 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:59136->38.102.83.65:40841: write tcp 38.102.83.65:59136->38.102.83.65:40841: write: broken pipe Mar 17 02:49:04 crc kubenswrapper[4735]: I0317 02:49:04.093133 4735 scope.go:117] "RemoveContainer" containerID="9f8839a50270ceb44e20ca92fe5d2d241dcdb3af88ff98734c963efa54f5d5bf" Mar 17 02:49:06 crc kubenswrapper[4735]: I0317 02:49:06.073485 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:49:06 crc kubenswrapper[4735]: E0317 02:49:06.074282 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:49:19 crc kubenswrapper[4735]: I0317 02:49:19.074231 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:49:19 crc kubenswrapper[4735]: E0317 02:49:19.075322 
4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:49:31 crc kubenswrapper[4735]: I0317 02:49:31.073910 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:49:31 crc kubenswrapper[4735]: E0317 02:49:31.076531 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:49:45 crc kubenswrapper[4735]: I0317 02:49:45.088407 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:49:45 crc kubenswrapper[4735]: E0317 02:49:45.089962 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:49:57 crc kubenswrapper[4735]: I0317 02:49:57.074614 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:49:57 crc kubenswrapper[4735]: E0317 
02:49:57.075763 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.163396 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561930-ckrwl"] Mar 17 02:50:00 crc kubenswrapper[4735]: E0317 02:50:00.164230 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-httpd" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.164246 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-httpd" Mar 17 02:50:00 crc kubenswrapper[4735]: E0317 02:50:00.164277 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-api" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.164284 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-api" Mar 17 02:50:00 crc kubenswrapper[4735]: E0317 02:50:00.164302 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5561d2-f21e-4b59-9421-b23888d9f704" containerName="oc" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.164312 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5561d2-f21e-4b59-9421-b23888d9f704" containerName="oc" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.164518 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-httpd" Mar 17 02:50:00 crc 
kubenswrapper[4735]: I0317 02:50:00.164536 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b7ea53-cedc-42d2-a218-183b2be7646c" containerName="neutron-api" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.164559 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5561d2-f21e-4b59-9421-b23888d9f704" containerName="oc" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.171613 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.175600 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.180459 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.181079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.192731 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561930-ckrwl"] Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.347189 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kcrt\" (UniqueName: \"kubernetes.io/projected/c225a571-1736-4315-ab73-932d1ce8e085-kube-api-access-5kcrt\") pod \"auto-csr-approver-29561930-ckrwl\" (UID: \"c225a571-1736-4315-ab73-932d1ce8e085\") " pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.449357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kcrt\" (UniqueName: \"kubernetes.io/projected/c225a571-1736-4315-ab73-932d1ce8e085-kube-api-access-5kcrt\") pod 
\"auto-csr-approver-29561930-ckrwl\" (UID: \"c225a571-1736-4315-ab73-932d1ce8e085\") " pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.478819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kcrt\" (UniqueName: \"kubernetes.io/projected/c225a571-1736-4315-ab73-932d1ce8e085-kube-api-access-5kcrt\") pod \"auto-csr-approver-29561930-ckrwl\" (UID: \"c225a571-1736-4315-ab73-932d1ce8e085\") " pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.505156 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:00 crc kubenswrapper[4735]: I0317 02:50:00.994289 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561930-ckrwl"] Mar 17 02:50:01 crc kubenswrapper[4735]: I0317 02:50:01.720177 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" event={"ID":"c225a571-1736-4315-ab73-932d1ce8e085","Type":"ContainerStarted","Data":"40a4b49b11ced4f6ae7a9a11c2e3d6978b1a5a90096abf8e886a782465398b46"} Mar 17 02:50:02 crc kubenswrapper[4735]: I0317 02:50:02.732187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" event={"ID":"c225a571-1736-4315-ab73-932d1ce8e085","Type":"ContainerStarted","Data":"c31380d19841b94c8ece4e8bd2d87dd9b59e22a2d746cd1e29f01cf5dddd03c5"} Mar 17 02:50:02 crc kubenswrapper[4735]: I0317 02:50:02.751467 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" podStartSLOduration=1.423517379 podStartE2EDuration="2.75144933s" podCreationTimestamp="2026-03-17 02:50:00 +0000 UTC" firstStartedPulling="2026-03-17 02:50:01.00279294 +0000 UTC m=+6026.635025918" lastFinishedPulling="2026-03-17 
02:50:02.330724861 +0000 UTC m=+6027.962957869" observedRunningTime="2026-03-17 02:50:02.748289785 +0000 UTC m=+6028.380522763" watchObservedRunningTime="2026-03-17 02:50:02.75144933 +0000 UTC m=+6028.383682308" Mar 17 02:50:03 crc kubenswrapper[4735]: I0317 02:50:03.748030 4735 generic.go:334] "Generic (PLEG): container finished" podID="c225a571-1736-4315-ab73-932d1ce8e085" containerID="c31380d19841b94c8ece4e8bd2d87dd9b59e22a2d746cd1e29f01cf5dddd03c5" exitCode=0 Mar 17 02:50:03 crc kubenswrapper[4735]: I0317 02:50:03.748122 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" event={"ID":"c225a571-1736-4315-ab73-932d1ce8e085","Type":"ContainerDied","Data":"c31380d19841b94c8ece4e8bd2d87dd9b59e22a2d746cd1e29f01cf5dddd03c5"} Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.112089 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.249313 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kcrt\" (UniqueName: \"kubernetes.io/projected/c225a571-1736-4315-ab73-932d1ce8e085-kube-api-access-5kcrt\") pod \"c225a571-1736-4315-ab73-932d1ce8e085\" (UID: \"c225a571-1736-4315-ab73-932d1ce8e085\") " Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.257116 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c225a571-1736-4315-ab73-932d1ce8e085-kube-api-access-5kcrt" (OuterVolumeSpecName: "kube-api-access-5kcrt") pod "c225a571-1736-4315-ab73-932d1ce8e085" (UID: "c225a571-1736-4315-ab73-932d1ce8e085"). InnerVolumeSpecName "kube-api-access-5kcrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.351474 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kcrt\" (UniqueName: \"kubernetes.io/projected/c225a571-1736-4315-ab73-932d1ce8e085-kube-api-access-5kcrt\") on node \"crc\" DevicePath \"\"" Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.772608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" event={"ID":"c225a571-1736-4315-ab73-932d1ce8e085","Type":"ContainerDied","Data":"40a4b49b11ced4f6ae7a9a11c2e3d6978b1a5a90096abf8e886a782465398b46"} Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.772669 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40a4b49b11ced4f6ae7a9a11c2e3d6978b1a5a90096abf8e886a782465398b46" Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.772703 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561930-ckrwl" Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.860605 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561924-z2wzc"] Mar 17 02:50:05 crc kubenswrapper[4735]: I0317 02:50:05.872898 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561924-z2wzc"] Mar 17 02:50:07 crc kubenswrapper[4735]: I0317 02:50:07.094150 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287cfd38-e820-494c-9be5-a86fd369b3f8" path="/var/lib/kubelet/pods/287cfd38-e820-494c-9be5-a86fd369b3f8/volumes" Mar 17 02:50:11 crc kubenswrapper[4735]: I0317 02:50:11.073216 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:50:11 crc kubenswrapper[4735]: E0317 02:50:11.073850 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:50:26 crc kubenswrapper[4735]: I0317 02:50:26.073179 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:50:26 crc kubenswrapper[4735]: E0317 02:50:26.075190 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:50:39 crc kubenswrapper[4735]: I0317 02:50:39.073372 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:50:39 crc kubenswrapper[4735]: E0317 02:50:39.074390 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:50:50 crc kubenswrapper[4735]: I0317 02:50:50.073960 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:50:50 crc kubenswrapper[4735]: E0317 02:50:50.075043 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:51:04 crc kubenswrapper[4735]: I0317 02:51:04.245913 4735 scope.go:117] "RemoveContainer" containerID="e0b57652dbcd18847a9b5678198cedc5e373ccf6d142fd50433753216d60f5b3" Mar 17 02:51:05 crc kubenswrapper[4735]: I0317 02:51:05.085492 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:51:05 crc kubenswrapper[4735]: E0317 02:51:05.086106 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:51:16 crc kubenswrapper[4735]: I0317 02:51:16.073494 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:51:16 crc kubenswrapper[4735]: E0317 02:51:16.075049 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:51:28 crc kubenswrapper[4735]: I0317 02:51:28.074683 4735 scope.go:117] 
"RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:51:28 crc kubenswrapper[4735]: E0317 02:51:28.075828 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:51:43 crc kubenswrapper[4735]: I0317 02:51:43.076180 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:51:43 crc kubenswrapper[4735]: I0317 02:51:43.975740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"6336d8673290b23c0852972ac6ab5890a1b7cdbc464ec54e27ad8a56080a4dc6"} Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.223674 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561932-hrzv2"] Mar 17 02:52:00 crc kubenswrapper[4735]: E0317 02:52:00.224680 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c225a571-1736-4315-ab73-932d1ce8e085" containerName="oc" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.224695 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c225a571-1736-4315-ab73-932d1ce8e085" containerName="oc" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.224935 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c225a571-1736-4315-ab73-932d1ce8e085" containerName="oc" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.225697 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.228845 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.230525 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.230584 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.236125 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561932-hrzv2"] Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.334766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97gvb\" (UniqueName: \"kubernetes.io/projected/f9554281-24fb-4dc3-b693-81e5b9199a04-kube-api-access-97gvb\") pod \"auto-csr-approver-29561932-hrzv2\" (UID: \"f9554281-24fb-4dc3-b693-81e5b9199a04\") " pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.437210 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97gvb\" (UniqueName: \"kubernetes.io/projected/f9554281-24fb-4dc3-b693-81e5b9199a04-kube-api-access-97gvb\") pod \"auto-csr-approver-29561932-hrzv2\" (UID: \"f9554281-24fb-4dc3-b693-81e5b9199a04\") " pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.462848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97gvb\" (UniqueName: \"kubernetes.io/projected/f9554281-24fb-4dc3-b693-81e5b9199a04-kube-api-access-97gvb\") pod \"auto-csr-approver-29561932-hrzv2\" (UID: \"f9554281-24fb-4dc3-b693-81e5b9199a04\") " 
pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:00 crc kubenswrapper[4735]: I0317 02:52:00.553051 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:01 crc kubenswrapper[4735]: I0317 02:52:01.067224 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561932-hrzv2"] Mar 17 02:52:01 crc kubenswrapper[4735]: I0317 02:52:01.088432 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:52:01 crc kubenswrapper[4735]: I0317 02:52:01.154416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" event={"ID":"f9554281-24fb-4dc3-b693-81e5b9199a04","Type":"ContainerStarted","Data":"8c9f949a3deb75b69848693a0b4d421ce5d611b5d9c83e9ebfbbb0dd52fef720"} Mar 17 02:52:03 crc kubenswrapper[4735]: I0317 02:52:03.170698 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9554281-24fb-4dc3-b693-81e5b9199a04" containerID="e1f949776bfa58b3bf940c70e662a5a9a424f8770ab10c9e24d98fa7d21021dd" exitCode=0 Mar 17 02:52:03 crc kubenswrapper[4735]: I0317 02:52:03.171269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" event={"ID":"f9554281-24fb-4dc3-b693-81e5b9199a04","Type":"ContainerDied","Data":"e1f949776bfa58b3bf940c70e662a5a9a424f8770ab10c9e24d98fa7d21021dd"} Mar 17 02:52:04 crc kubenswrapper[4735]: I0317 02:52:04.559344 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:04 crc kubenswrapper[4735]: I0317 02:52:04.620435 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97gvb\" (UniqueName: \"kubernetes.io/projected/f9554281-24fb-4dc3-b693-81e5b9199a04-kube-api-access-97gvb\") pod \"f9554281-24fb-4dc3-b693-81e5b9199a04\" (UID: \"f9554281-24fb-4dc3-b693-81e5b9199a04\") " Mar 17 02:52:04 crc kubenswrapper[4735]: I0317 02:52:04.633079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9554281-24fb-4dc3-b693-81e5b9199a04-kube-api-access-97gvb" (OuterVolumeSpecName: "kube-api-access-97gvb") pod "f9554281-24fb-4dc3-b693-81e5b9199a04" (UID: "f9554281-24fb-4dc3-b693-81e5b9199a04"). InnerVolumeSpecName "kube-api-access-97gvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:52:04 crc kubenswrapper[4735]: I0317 02:52:04.723169 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97gvb\" (UniqueName: \"kubernetes.io/projected/f9554281-24fb-4dc3-b693-81e5b9199a04-kube-api-access-97gvb\") on node \"crc\" DevicePath \"\"" Mar 17 02:52:05 crc kubenswrapper[4735]: I0317 02:52:05.187526 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" event={"ID":"f9554281-24fb-4dc3-b693-81e5b9199a04","Type":"ContainerDied","Data":"8c9f949a3deb75b69848693a0b4d421ce5d611b5d9c83e9ebfbbb0dd52fef720"} Mar 17 02:52:05 crc kubenswrapper[4735]: I0317 02:52:05.187772 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c9f949a3deb75b69848693a0b4d421ce5d611b5d9c83e9ebfbbb0dd52fef720" Mar 17 02:52:05 crc kubenswrapper[4735]: I0317 02:52:05.187901 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561932-hrzv2" Mar 17 02:52:05 crc kubenswrapper[4735]: I0317 02:52:05.623666 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561926-tk7bj"] Mar 17 02:52:05 crc kubenswrapper[4735]: I0317 02:52:05.630813 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561926-tk7bj"] Mar 17 02:52:07 crc kubenswrapper[4735]: I0317 02:52:07.089344 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e409084-9c00-49ac-b69a-626f23589504" path="/var/lib/kubelet/pods/0e409084-9c00-49ac-b69a-626f23589504/volumes" Mar 17 02:53:04 crc kubenswrapper[4735]: I0317 02:53:04.397215 4735 scope.go:117] "RemoveContainer" containerID="513f9b740e9fb879ec5819d3edf6b1af46c48c158fa296bab303720c8e9ae028" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.370407 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6h9z"] Mar 17 02:53:23 crc kubenswrapper[4735]: E0317 02:53:23.377691 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9554281-24fb-4dc3-b693-81e5b9199a04" containerName="oc" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.377714 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9554281-24fb-4dc3-b693-81e5b9199a04" containerName="oc" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.377976 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9554281-24fb-4dc3-b693-81e5b9199a04" containerName="oc" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.379452 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.391219 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6h9z"] Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.396023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-utilities\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.396336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-catalog-content\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.396484 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv66r\" (UniqueName: \"kubernetes.io/projected/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-kube-api-access-gv66r\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.527531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-utilities\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.527772 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-catalog-content\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.527843 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv66r\" (UniqueName: \"kubernetes.io/projected/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-kube-api-access-gv66r\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.527969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-utilities\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.529775 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-catalog-content\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.558767 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv66r\" (UniqueName: \"kubernetes.io/projected/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-kube-api-access-gv66r\") pod \"certified-operators-l6h9z\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:23 crc kubenswrapper[4735]: I0317 02:53:23.709175 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:24 crc kubenswrapper[4735]: I0317 02:53:24.170474 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6h9z"] Mar 17 02:53:24 crc kubenswrapper[4735]: I0317 02:53:24.957007 4735 generic.go:334] "Generic (PLEG): container finished" podID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerID="b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0" exitCode=0 Mar 17 02:53:24 crc kubenswrapper[4735]: I0317 02:53:24.957098 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerDied","Data":"b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0"} Mar 17 02:53:24 crc kubenswrapper[4735]: I0317 02:53:24.957354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerStarted","Data":"3f8e0c0d46c95e3cda912b8ab77db9fa149b22fafcf1b74ff5170dce46bd7f2f"} Mar 17 02:53:25 crc kubenswrapper[4735]: I0317 02:53:25.968396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerStarted","Data":"6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a"} Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.366550 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqzd"] Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.368686 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.381996 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqzd"] Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.488387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-catalog-content\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.488523 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-utilities\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.488569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmg6\" (UniqueName: \"kubernetes.io/projected/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-kube-api-access-ccmg6\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.590281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-utilities\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.590366 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ccmg6\" (UniqueName: \"kubernetes.io/projected/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-kube-api-access-ccmg6\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.590560 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-catalog-content\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.591004 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-utilities\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.591101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-catalog-content\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.619241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmg6\" (UniqueName: \"kubernetes.io/projected/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-kube-api-access-ccmg6\") pod \"redhat-marketplace-wgqzd\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:26 crc kubenswrapper[4735]: I0317 02:53:26.687706 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:27 crc kubenswrapper[4735]: I0317 02:53:27.240655 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqzd"] Mar 17 02:53:27 crc kubenswrapper[4735]: W0317 02:53:27.253277 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2ea802_500c_4ae2_bcd0_b5c0647780b6.slice/crio-73d95965b946aa9c0675dc75cc4de884f4e42aa24686cdfeb44369fa7b31e8e2 WatchSource:0}: Error finding container 73d95965b946aa9c0675dc75cc4de884f4e42aa24686cdfeb44369fa7b31e8e2: Status 404 returned error can't find the container with id 73d95965b946aa9c0675dc75cc4de884f4e42aa24686cdfeb44369fa7b31e8e2 Mar 17 02:53:27 crc kubenswrapper[4735]: I0317 02:53:27.990213 4735 generic.go:334] "Generic (PLEG): container finished" podID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerID="6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a" exitCode=0 Mar 17 02:53:27 crc kubenswrapper[4735]: I0317 02:53:27.990344 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerDied","Data":"6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a"} Mar 17 02:53:27 crc kubenswrapper[4735]: I0317 02:53:27.994517 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerID="f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd" exitCode=0 Mar 17 02:53:27 crc kubenswrapper[4735]: I0317 02:53:27.994580 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerDied","Data":"f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd"} Mar 17 02:53:27 crc kubenswrapper[4735]: I0317 
02:53:27.996069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerStarted","Data":"73d95965b946aa9c0675dc75cc4de884f4e42aa24686cdfeb44369fa7b31e8e2"} Mar 17 02:53:29 crc kubenswrapper[4735]: I0317 02:53:29.006687 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerStarted","Data":"46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5"} Mar 17 02:53:29 crc kubenswrapper[4735]: I0317 02:53:29.008256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerStarted","Data":"f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a"} Mar 17 02:53:29 crc kubenswrapper[4735]: I0317 02:53:29.042183 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6h9z" podStartSLOduration=2.562075052 podStartE2EDuration="6.042161323s" podCreationTimestamp="2026-03-17 02:53:23 +0000 UTC" firstStartedPulling="2026-03-17 02:53:24.963978832 +0000 UTC m=+6230.596211850" lastFinishedPulling="2026-03-17 02:53:28.444065143 +0000 UTC m=+6234.076298121" observedRunningTime="2026-03-17 02:53:29.02945339 +0000 UTC m=+6234.661686408" watchObservedRunningTime="2026-03-17 02:53:29.042161323 +0000 UTC m=+6234.674394311" Mar 17 02:53:30 crc kubenswrapper[4735]: I0317 02:53:30.018688 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerID="f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a" exitCode=0 Mar 17 02:53:30 crc kubenswrapper[4735]: I0317 02:53:30.018790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" 
event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerDied","Data":"f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a"} Mar 17 02:53:31 crc kubenswrapper[4735]: I0317 02:53:31.030482 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerStarted","Data":"261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6"} Mar 17 02:53:31 crc kubenswrapper[4735]: I0317 02:53:31.047655 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wgqzd" podStartSLOduration=2.605176245 podStartE2EDuration="5.047635846s" podCreationTimestamp="2026-03-17 02:53:26 +0000 UTC" firstStartedPulling="2026-03-17 02:53:27.997672109 +0000 UTC m=+6233.629905097" lastFinishedPulling="2026-03-17 02:53:30.44013172 +0000 UTC m=+6236.072364698" observedRunningTime="2026-03-17 02:53:31.046291354 +0000 UTC m=+6236.678524332" watchObservedRunningTime="2026-03-17 02:53:31.047635846 +0000 UTC m=+6236.679868824" Mar 17 02:53:33 crc kubenswrapper[4735]: I0317 02:53:33.709670 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:33 crc kubenswrapper[4735]: I0317 02:53:33.710015 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:34 crc kubenswrapper[4735]: I0317 02:53:34.791320 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-l6h9z" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="registry-server" probeResult="failure" output=< Mar 17 02:53:34 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:53:34 crc kubenswrapper[4735]: > Mar 17 02:53:36 crc kubenswrapper[4735]: I0317 02:53:36.688389 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:36 crc kubenswrapper[4735]: I0317 02:53:36.688740 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:36 crc kubenswrapper[4735]: I0317 02:53:36.784199 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:37 crc kubenswrapper[4735]: I0317 02:53:37.142512 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:37 crc kubenswrapper[4735]: I0317 02:53:37.192302 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqzd"] Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.096063 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wgqzd" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="registry-server" containerID="cri-o://261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6" gracePeriod=2 Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.653800 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.762594 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccmg6\" (UniqueName: \"kubernetes.io/projected/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-kube-api-access-ccmg6\") pod \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.762743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-utilities\") pod \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.762930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-catalog-content\") pod \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\" (UID: \"3b2ea802-500c-4ae2-bcd0-b5c0647780b6\") " Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.764201 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-utilities" (OuterVolumeSpecName: "utilities") pod "3b2ea802-500c-4ae2-bcd0-b5c0647780b6" (UID: "3b2ea802-500c-4ae2-bcd0-b5c0647780b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.774099 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-kube-api-access-ccmg6" (OuterVolumeSpecName: "kube-api-access-ccmg6") pod "3b2ea802-500c-4ae2-bcd0-b5c0647780b6" (UID: "3b2ea802-500c-4ae2-bcd0-b5c0647780b6"). InnerVolumeSpecName "kube-api-access-ccmg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.798845 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b2ea802-500c-4ae2-bcd0-b5c0647780b6" (UID: "3b2ea802-500c-4ae2-bcd0-b5c0647780b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.866106 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccmg6\" (UniqueName: \"kubernetes.io/projected/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-kube-api-access-ccmg6\") on node \"crc\" DevicePath \"\"" Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.866182 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:53:39 crc kubenswrapper[4735]: I0317 02:53:39.866201 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea802-500c-4ae2-bcd0-b5c0647780b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.109795 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerID="261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6" exitCode=0 Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.109842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerDied","Data":"261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6"} Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.109890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wgqzd" event={"ID":"3b2ea802-500c-4ae2-bcd0-b5c0647780b6","Type":"ContainerDied","Data":"73d95965b946aa9c0675dc75cc4de884f4e42aa24686cdfeb44369fa7b31e8e2"} Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.109913 4735 scope.go:117] "RemoveContainer" containerID="261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.109947 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqzd" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.140871 4735 scope.go:117] "RemoveContainer" containerID="f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.167142 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqzd"] Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.168846 4735 scope.go:117] "RemoveContainer" containerID="f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.179172 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqzd"] Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.223844 4735 scope.go:117] "RemoveContainer" containerID="261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6" Mar 17 02:53:40 crc kubenswrapper[4735]: E0317 02:53:40.225476 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6\": container with ID starting with 261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6 not found: ID does not exist" containerID="261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.225529 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6"} err="failed to get container status \"261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6\": rpc error: code = NotFound desc = could not find container \"261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6\": container with ID starting with 261506fe03c40c073aafb4fd016ad4520e30c7a885f9d2887b1cb220ae4c8df6 not found: ID does not exist" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.225557 4735 scope.go:117] "RemoveContainer" containerID="f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a" Mar 17 02:53:40 crc kubenswrapper[4735]: E0317 02:53:40.226083 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a\": container with ID starting with f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a not found: ID does not exist" containerID="f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.226119 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a"} err="failed to get container status \"f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a\": rpc error: code = NotFound desc = could not find container \"f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a\": container with ID starting with f8f6294a660f8401a2a06d8f5266392a7d3e67dd5ef9bf0c7bae9e0f1547671a not found: ID does not exist" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.226139 4735 scope.go:117] "RemoveContainer" containerID="f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd" Mar 17 02:53:40 crc kubenswrapper[4735]: E0317 
02:53:40.228122 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd\": container with ID starting with f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd not found: ID does not exist" containerID="f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd" Mar 17 02:53:40 crc kubenswrapper[4735]: I0317 02:53:40.228154 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd"} err="failed to get container status \"f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd\": rpc error: code = NotFound desc = could not find container \"f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd\": container with ID starting with f61a72a26f50ee3d0dade35bdd8d1e4b6e640c2c33cb3ada8307ad42d0f8adbd not found: ID does not exist" Mar 17 02:53:41 crc kubenswrapper[4735]: I0317 02:53:41.092393 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" path="/var/lib/kubelet/pods/3b2ea802-500c-4ae2-bcd0-b5c0647780b6/volumes" Mar 17 02:53:43 crc kubenswrapper[4735]: I0317 02:53:43.783894 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:43 crc kubenswrapper[4735]: I0317 02:53:43.845421 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:44 crc kubenswrapper[4735]: I0317 02:53:44.032152 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6h9z"] Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.161825 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-l6h9z" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="registry-server" containerID="cri-o://46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5" gracePeriod=2 Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.637442 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.813347 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-catalog-content\") pod \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.813460 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-utilities\") pod \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.813659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv66r\" (UniqueName: \"kubernetes.io/projected/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-kube-api-access-gv66r\") pod \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\" (UID: \"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24\") " Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.814840 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-utilities" (OuterVolumeSpecName: "utilities") pod "c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" (UID: "c8d707d6-ed7a-4020-a8f5-0cbb95b37f24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.820125 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-kube-api-access-gv66r" (OuterVolumeSpecName: "kube-api-access-gv66r") pod "c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" (UID: "c8d707d6-ed7a-4020-a8f5-0cbb95b37f24"). InnerVolumeSpecName "kube-api-access-gv66r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.866013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" (UID: "c8d707d6-ed7a-4020-a8f5-0cbb95b37f24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.915767 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv66r\" (UniqueName: \"kubernetes.io/projected/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-kube-api-access-gv66r\") on node \"crc\" DevicePath \"\"" Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.916055 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:53:45 crc kubenswrapper[4735]: I0317 02:53:45.916143 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.176423 4735 generic.go:334] "Generic (PLEG): container finished" podID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" 
containerID="46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5" exitCode=0 Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.176492 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerDied","Data":"46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5"} Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.176528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6h9z" event={"ID":"c8d707d6-ed7a-4020-a8f5-0cbb95b37f24","Type":"ContainerDied","Data":"3f8e0c0d46c95e3cda912b8ab77db9fa149b22fafcf1b74ff5170dce46bd7f2f"} Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.176551 4735 scope.go:117] "RemoveContainer" containerID="46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.176925 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6h9z" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.227143 4735 scope.go:117] "RemoveContainer" containerID="6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.232936 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6h9z"] Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.242230 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6h9z"] Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.276190 4735 scope.go:117] "RemoveContainer" containerID="b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.307848 4735 scope.go:117] "RemoveContainer" containerID="46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5" Mar 17 02:53:46 crc kubenswrapper[4735]: E0317 02:53:46.308442 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5\": container with ID starting with 46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5 not found: ID does not exist" containerID="46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.308505 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5"} err="failed to get container status \"46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5\": rpc error: code = NotFound desc = could not find container \"46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5\": container with ID starting with 46d47fbf73dd08f9b8936e6ed82f8ada6bc4362b6d600e22520f400edfc8baf5 not 
found: ID does not exist" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.308544 4735 scope.go:117] "RemoveContainer" containerID="6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a" Mar 17 02:53:46 crc kubenswrapper[4735]: E0317 02:53:46.309264 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a\": container with ID starting with 6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a not found: ID does not exist" containerID="6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.309318 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a"} err="failed to get container status \"6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a\": rpc error: code = NotFound desc = could not find container \"6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a\": container with ID starting with 6697a12a25d371051445fc4ea4daaed7639e11e61c21d1085dd138d95bab314a not found: ID does not exist" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.309374 4735 scope.go:117] "RemoveContainer" containerID="b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0" Mar 17 02:53:46 crc kubenswrapper[4735]: E0317 02:53:46.309695 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0\": container with ID starting with b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0 not found: ID does not exist" containerID="b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0" Mar 17 02:53:46 crc kubenswrapper[4735]: I0317 02:53:46.309740 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0"} err="failed to get container status \"b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0\": rpc error: code = NotFound desc = could not find container \"b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0\": container with ID starting with b2863b361845fb3a29eb4fa86dbd4cdb20104b4558c11c15c17a977512ffcad0 not found: ID does not exist" Mar 17 02:53:47 crc kubenswrapper[4735]: I0317 02:53:47.090255 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" path="/var/lib/kubelet/pods/c8d707d6-ed7a-4020-a8f5-0cbb95b37f24/volumes" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.195448 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561934-7hm7j"] Mar 17 02:54:00 crc kubenswrapper[4735]: E0317 02:54:00.197132 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="extract-utilities" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197164 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="extract-utilities" Mar 17 02:54:00 crc kubenswrapper[4735]: E0317 02:54:00.197200 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="extract-content" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197209 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="extract-content" Mar 17 02:54:00 crc kubenswrapper[4735]: E0317 02:54:00.197225 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="registry-server" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 
02:54:00.197235 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="registry-server" Mar 17 02:54:00 crc kubenswrapper[4735]: E0317 02:54:00.197260 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="extract-utilities" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197268 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="extract-utilities" Mar 17 02:54:00 crc kubenswrapper[4735]: E0317 02:54:00.197294 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="extract-content" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197313 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="extract-content" Mar 17 02:54:00 crc kubenswrapper[4735]: E0317 02:54:00.197343 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="registry-server" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197352 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="registry-server" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197850 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d707d6-ed7a-4020-a8f5-0cbb95b37f24" containerName="registry-server" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.197927 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2ea802-500c-4ae2-bcd0-b5c0647780b6" containerName="registry-server" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.202304 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.205178 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.205407 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.205579 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.220608 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561934-7hm7j"] Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.344259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswn8\" (UniqueName: \"kubernetes.io/projected/9a362e14-7198-4129-a8a7-99b0ae1f45eb-kube-api-access-fswn8\") pod \"auto-csr-approver-29561934-7hm7j\" (UID: \"9a362e14-7198-4129-a8a7-99b0ae1f45eb\") " pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.446238 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswn8\" (UniqueName: \"kubernetes.io/projected/9a362e14-7198-4129-a8a7-99b0ae1f45eb-kube-api-access-fswn8\") pod \"auto-csr-approver-29561934-7hm7j\" (UID: \"9a362e14-7198-4129-a8a7-99b0ae1f45eb\") " pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.468918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswn8\" (UniqueName: \"kubernetes.io/projected/9a362e14-7198-4129-a8a7-99b0ae1f45eb-kube-api-access-fswn8\") pod \"auto-csr-approver-29561934-7hm7j\" (UID: \"9a362e14-7198-4129-a8a7-99b0ae1f45eb\") " 
pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:00 crc kubenswrapper[4735]: I0317 02:54:00.525852 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:01 crc kubenswrapper[4735]: I0317 02:54:01.058385 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561934-7hm7j"] Mar 17 02:54:01 crc kubenswrapper[4735]: W0317 02:54:01.067444 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a362e14_7198_4129_a8a7_99b0ae1f45eb.slice/crio-ae4f8905e3c8632756387cd872e1139f4fdf8a0ca6c1f98a495a1c9cf7fcd14b WatchSource:0}: Error finding container ae4f8905e3c8632756387cd872e1139f4fdf8a0ca6c1f98a495a1c9cf7fcd14b: Status 404 returned error can't find the container with id ae4f8905e3c8632756387cd872e1139f4fdf8a0ca6c1f98a495a1c9cf7fcd14b Mar 17 02:54:01 crc kubenswrapper[4735]: I0317 02:54:01.352930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" event={"ID":"9a362e14-7198-4129-a8a7-99b0ae1f45eb","Type":"ContainerStarted","Data":"ae4f8905e3c8632756387cd872e1139f4fdf8a0ca6c1f98a495a1c9cf7fcd14b"} Mar 17 02:54:02 crc kubenswrapper[4735]: I0317 02:54:02.371067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" event={"ID":"9a362e14-7198-4129-a8a7-99b0ae1f45eb","Type":"ContainerStarted","Data":"24d9baa8b7d2213be9e054dd025798a89fa5b7e6ddddb725343075df14647158"} Mar 17 02:54:02 crc kubenswrapper[4735]: I0317 02:54:02.395828 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" podStartSLOduration=1.530582548 podStartE2EDuration="2.395809226s" podCreationTimestamp="2026-03-17 02:54:00 +0000 UTC" firstStartedPulling="2026-03-17 02:54:01.068917629 +0000 UTC 
m=+6266.701150607" lastFinishedPulling="2026-03-17 02:54:01.934144267 +0000 UTC m=+6267.566377285" observedRunningTime="2026-03-17 02:54:02.389931806 +0000 UTC m=+6268.022164804" watchObservedRunningTime="2026-03-17 02:54:02.395809226 +0000 UTC m=+6268.028042224" Mar 17 02:54:03 crc kubenswrapper[4735]: I0317 02:54:03.381321 4735 generic.go:334] "Generic (PLEG): container finished" podID="9a362e14-7198-4129-a8a7-99b0ae1f45eb" containerID="24d9baa8b7d2213be9e054dd025798a89fa5b7e6ddddb725343075df14647158" exitCode=0 Mar 17 02:54:03 crc kubenswrapper[4735]: I0317 02:54:03.381399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" event={"ID":"9a362e14-7198-4129-a8a7-99b0ae1f45eb","Type":"ContainerDied","Data":"24d9baa8b7d2213be9e054dd025798a89fa5b7e6ddddb725343075df14647158"} Mar 17 02:54:04 crc kubenswrapper[4735]: I0317 02:54:04.824167 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:04 crc kubenswrapper[4735]: I0317 02:54:04.965961 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fswn8\" (UniqueName: \"kubernetes.io/projected/9a362e14-7198-4129-a8a7-99b0ae1f45eb-kube-api-access-fswn8\") pod \"9a362e14-7198-4129-a8a7-99b0ae1f45eb\" (UID: \"9a362e14-7198-4129-a8a7-99b0ae1f45eb\") " Mar 17 02:54:04 crc kubenswrapper[4735]: I0317 02:54:04.976242 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a362e14-7198-4129-a8a7-99b0ae1f45eb-kube-api-access-fswn8" (OuterVolumeSpecName: "kube-api-access-fswn8") pod "9a362e14-7198-4129-a8a7-99b0ae1f45eb" (UID: "9a362e14-7198-4129-a8a7-99b0ae1f45eb"). InnerVolumeSpecName "kube-api-access-fswn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:54:05 crc kubenswrapper[4735]: I0317 02:54:05.067822 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fswn8\" (UniqueName: \"kubernetes.io/projected/9a362e14-7198-4129-a8a7-99b0ae1f45eb-kube-api-access-fswn8\") on node \"crc\" DevicePath \"\"" Mar 17 02:54:05 crc kubenswrapper[4735]: I0317 02:54:05.415191 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" event={"ID":"9a362e14-7198-4129-a8a7-99b0ae1f45eb","Type":"ContainerDied","Data":"ae4f8905e3c8632756387cd872e1139f4fdf8a0ca6c1f98a495a1c9cf7fcd14b"} Mar 17 02:54:05 crc kubenswrapper[4735]: I0317 02:54:05.415523 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae4f8905e3c8632756387cd872e1139f4fdf8a0ca6c1f98a495a1c9cf7fcd14b" Mar 17 02:54:05 crc kubenswrapper[4735]: I0317 02:54:05.415305 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561934-7hm7j" Mar 17 02:54:05 crc kubenswrapper[4735]: I0317 02:54:05.489555 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561928-9smrv"] Mar 17 02:54:05 crc kubenswrapper[4735]: I0317 02:54:05.501616 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561928-9smrv"] Mar 17 02:54:07 crc kubenswrapper[4735]: I0317 02:54:07.085339 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5561d2-f21e-4b59-9421-b23888d9f704" path="/var/lib/kubelet/pods/1c5561d2-f21e-4b59-9421-b23888d9f704/volumes" Mar 17 02:54:12 crc kubenswrapper[4735]: I0317 02:54:12.606670 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 02:54:12 crc kubenswrapper[4735]: I0317 02:54:12.607122 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:54:42 crc kubenswrapper[4735]: I0317 02:54:42.606216 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:54:42 crc kubenswrapper[4735]: I0317 02:54:42.606900 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:55:04 crc kubenswrapper[4735]: I0317 02:55:04.611360 4735 scope.go:117] "RemoveContainer" containerID="3309074f7babf6c8f752af96cbabb6977fc6db30089081c8657ac052e880ca08" Mar 17 02:55:12 crc kubenswrapper[4735]: I0317 02:55:12.606428 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:55:12 crc kubenswrapper[4735]: I0317 02:55:12.607230 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:55:12 crc kubenswrapper[4735]: I0317 02:55:12.607298 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:55:12 crc kubenswrapper[4735]: I0317 02:55:12.608482 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6336d8673290b23c0852972ac6ab5890a1b7cdbc464ec54e27ad8a56080a4dc6"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:55:12 crc kubenswrapper[4735]: I0317 02:55:12.608570 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://6336d8673290b23c0852972ac6ab5890a1b7cdbc464ec54e27ad8a56080a4dc6" gracePeriod=600 Mar 17 02:55:13 crc kubenswrapper[4735]: I0317 02:55:13.165046 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="6336d8673290b23c0852972ac6ab5890a1b7cdbc464ec54e27ad8a56080a4dc6" exitCode=0 Mar 17 02:55:13 crc kubenswrapper[4735]: I0317 02:55:13.165125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"6336d8673290b23c0852972ac6ab5890a1b7cdbc464ec54e27ad8a56080a4dc6"} Mar 17 02:55:13 crc kubenswrapper[4735]: I0317 02:55:13.165435 4735 scope.go:117] "RemoveContainer" containerID="cc66c2f2cb818b5d4294be402323cde5912a7e2daf26bb765a8c54a7e33af7bb" Mar 17 02:55:14 crc kubenswrapper[4735]: I0317 02:55:14.175895 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891"} Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.160746 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561936-cshbt"] Mar 17 02:56:00 crc kubenswrapper[4735]: E0317 02:56:00.161736 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a362e14-7198-4129-a8a7-99b0ae1f45eb" containerName="oc" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.161750 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a362e14-7198-4129-a8a7-99b0ae1f45eb" containerName="oc" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.161977 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a362e14-7198-4129-a8a7-99b0ae1f45eb" containerName="oc" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.162638 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.166216 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.166304 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.166572 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.173037 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561936-cshbt"] Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.250637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtv8x\" (UniqueName: \"kubernetes.io/projected/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06-kube-api-access-wtv8x\") pod \"auto-csr-approver-29561936-cshbt\" (UID: \"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06\") " pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.353000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtv8x\" (UniqueName: \"kubernetes.io/projected/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06-kube-api-access-wtv8x\") pod \"auto-csr-approver-29561936-cshbt\" (UID: \"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06\") " pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.380710 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtv8x\" (UniqueName: \"kubernetes.io/projected/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06-kube-api-access-wtv8x\") pod \"auto-csr-approver-29561936-cshbt\" (UID: \"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06\") " 
pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.487302 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:00 crc kubenswrapper[4735]: I0317 02:56:00.977399 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561936-cshbt"] Mar 17 02:56:01 crc kubenswrapper[4735]: I0317 02:56:01.730513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561936-cshbt" event={"ID":"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06","Type":"ContainerStarted","Data":"d5e037491106d023b71c61a4354431e28e6add021faf8ab26b23ba6c8278613c"} Mar 17 02:56:02 crc kubenswrapper[4735]: I0317 02:56:02.744908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561936-cshbt" event={"ID":"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06","Type":"ContainerStarted","Data":"330c6d7ffab67d0451139e6709bb407354a5b8d40450934bc8ffd8d129c66ae4"} Mar 17 02:56:02 crc kubenswrapper[4735]: I0317 02:56:02.760524 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561936-cshbt" podStartSLOduration=1.650331275 podStartE2EDuration="2.76050746s" podCreationTimestamp="2026-03-17 02:56:00 +0000 UTC" firstStartedPulling="2026-03-17 02:56:00.993205303 +0000 UTC m=+6386.625438271" lastFinishedPulling="2026-03-17 02:56:02.103381478 +0000 UTC m=+6387.735614456" observedRunningTime="2026-03-17 02:56:02.759560817 +0000 UTC m=+6388.391793795" watchObservedRunningTime="2026-03-17 02:56:02.76050746 +0000 UTC m=+6388.392740448" Mar 17 02:56:03 crc kubenswrapper[4735]: I0317 02:56:03.758125 4735 generic.go:334] "Generic (PLEG): container finished" podID="4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06" containerID="330c6d7ffab67d0451139e6709bb407354a5b8d40450934bc8ffd8d129c66ae4" exitCode=0 Mar 17 02:56:03 crc 
kubenswrapper[4735]: I0317 02:56:03.758625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561936-cshbt" event={"ID":"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06","Type":"ContainerDied","Data":"330c6d7ffab67d0451139e6709bb407354a5b8d40450934bc8ffd8d129c66ae4"} Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.186276 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.257873 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtv8x\" (UniqueName: \"kubernetes.io/projected/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06-kube-api-access-wtv8x\") pod \"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06\" (UID: \"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06\") " Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.267409 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06-kube-api-access-wtv8x" (OuterVolumeSpecName: "kube-api-access-wtv8x") pod "4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06" (UID: "4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06"). InnerVolumeSpecName "kube-api-access-wtv8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.360647 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtv8x\" (UniqueName: \"kubernetes.io/projected/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06-kube-api-access-wtv8x\") on node \"crc\" DevicePath \"\"" Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.779671 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561936-cshbt" event={"ID":"4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06","Type":"ContainerDied","Data":"d5e037491106d023b71c61a4354431e28e6add021faf8ab26b23ba6c8278613c"} Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.779723 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5e037491106d023b71c61a4354431e28e6add021faf8ab26b23ba6c8278613c" Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.779814 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561936-cshbt" Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.854338 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561930-ckrwl"] Mar 17 02:56:05 crc kubenswrapper[4735]: I0317 02:56:05.867757 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561930-ckrwl"] Mar 17 02:56:07 crc kubenswrapper[4735]: I0317 02:56:07.089214 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c225a571-1736-4315-ab73-932d1ce8e085" path="/var/lib/kubelet/pods/c225a571-1736-4315-ab73-932d1ce8e085/volumes" Mar 17 02:57:04 crc kubenswrapper[4735]: I0317 02:57:04.747000 4735 scope.go:117] "RemoveContainer" containerID="c31380d19841b94c8ece4e8bd2d87dd9b59e22a2d746cd1e29f01cf5dddd03c5" Mar 17 02:57:42 crc kubenswrapper[4735]: I0317 02:57:42.606084 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:57:42 crc kubenswrapper[4735]: I0317 02:57:42.606938 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.160319 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561938-22zck"] Mar 17 02:58:00 crc kubenswrapper[4735]: E0317 02:58:00.162292 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06" containerName="oc" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.162401 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06" containerName="oc" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.162735 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06" containerName="oc" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.163471 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.166606 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.167563 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.167711 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.174085 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561938-22zck"] Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.265342 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j7t\" (UniqueName: \"kubernetes.io/projected/79373054-b0ae-47ba-a0f4-eb65a20d565b-kube-api-access-d2j7t\") pod \"auto-csr-approver-29561938-22zck\" (UID: \"79373054-b0ae-47ba-a0f4-eb65a20d565b\") " pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.367599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j7t\" (UniqueName: \"kubernetes.io/projected/79373054-b0ae-47ba-a0f4-eb65a20d565b-kube-api-access-d2j7t\") pod \"auto-csr-approver-29561938-22zck\" (UID: \"79373054-b0ae-47ba-a0f4-eb65a20d565b\") " pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.387002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j7t\" (UniqueName: \"kubernetes.io/projected/79373054-b0ae-47ba-a0f4-eb65a20d565b-kube-api-access-d2j7t\") pod \"auto-csr-approver-29561938-22zck\" (UID: \"79373054-b0ae-47ba-a0f4-eb65a20d565b\") " 
pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.479880 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.988143 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561938-22zck"] Mar 17 02:58:00 crc kubenswrapper[4735]: I0317 02:58:00.996126 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:58:01 crc kubenswrapper[4735]: I0317 02:58:01.394025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561938-22zck" event={"ID":"79373054-b0ae-47ba-a0f4-eb65a20d565b","Type":"ContainerStarted","Data":"9b93fbd248a32309e729b97f188f83fbcf046a7e6e9a755531ff2d7fb94a0922"} Mar 17 02:58:02 crc kubenswrapper[4735]: I0317 02:58:02.404515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561938-22zck" event={"ID":"79373054-b0ae-47ba-a0f4-eb65a20d565b","Type":"ContainerStarted","Data":"32d77d9358c08cafc9f37a38894aa4dbb1094e36718900c2dc08dcdb2427e903"} Mar 17 02:58:02 crc kubenswrapper[4735]: I0317 02:58:02.428073 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561938-22zck" podStartSLOduration=1.389004148 podStartE2EDuration="2.428055603s" podCreationTimestamp="2026-03-17 02:58:00 +0000 UTC" firstStartedPulling="2026-03-17 02:58:00.99591674 +0000 UTC m=+6506.628149718" lastFinishedPulling="2026-03-17 02:58:02.034968185 +0000 UTC m=+6507.667201173" observedRunningTime="2026-03-17 02:58:02.419798066 +0000 UTC m=+6508.052031054" watchObservedRunningTime="2026-03-17 02:58:02.428055603 +0000 UTC m=+6508.060288581" Mar 17 02:58:03 crc kubenswrapper[4735]: I0317 02:58:03.434393 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="79373054-b0ae-47ba-a0f4-eb65a20d565b" containerID="32d77d9358c08cafc9f37a38894aa4dbb1094e36718900c2dc08dcdb2427e903" exitCode=0 Mar 17 02:58:03 crc kubenswrapper[4735]: I0317 02:58:03.434453 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561938-22zck" event={"ID":"79373054-b0ae-47ba-a0f4-eb65a20d565b","Type":"ContainerDied","Data":"32d77d9358c08cafc9f37a38894aa4dbb1094e36718900c2dc08dcdb2427e903"} Mar 17 02:58:04 crc kubenswrapper[4735]: I0317 02:58:04.918599 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.052098 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2j7t\" (UniqueName: \"kubernetes.io/projected/79373054-b0ae-47ba-a0f4-eb65a20d565b-kube-api-access-d2j7t\") pod \"79373054-b0ae-47ba-a0f4-eb65a20d565b\" (UID: \"79373054-b0ae-47ba-a0f4-eb65a20d565b\") " Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.059678 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79373054-b0ae-47ba-a0f4-eb65a20d565b-kube-api-access-d2j7t" (OuterVolumeSpecName: "kube-api-access-d2j7t") pod "79373054-b0ae-47ba-a0f4-eb65a20d565b" (UID: "79373054-b0ae-47ba-a0f4-eb65a20d565b"). InnerVolumeSpecName "kube-api-access-d2j7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.154487 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2j7t\" (UniqueName: \"kubernetes.io/projected/79373054-b0ae-47ba-a0f4-eb65a20d565b-kube-api-access-d2j7t\") on node \"crc\" DevicePath \"\"" Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.461450 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561938-22zck" event={"ID":"79373054-b0ae-47ba-a0f4-eb65a20d565b","Type":"ContainerDied","Data":"9b93fbd248a32309e729b97f188f83fbcf046a7e6e9a755531ff2d7fb94a0922"} Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.461972 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b93fbd248a32309e729b97f188f83fbcf046a7e6e9a755531ff2d7fb94a0922" Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.461609 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561938-22zck" Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.504362 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561932-hrzv2"] Mar 17 02:58:05 crc kubenswrapper[4735]: I0317 02:58:05.516358 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561932-hrzv2"] Mar 17 02:58:07 crc kubenswrapper[4735]: I0317 02:58:07.087794 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9554281-24fb-4dc3-b693-81e5b9199a04" path="/var/lib/kubelet/pods/f9554281-24fb-4dc3-b693-81e5b9199a04/volumes" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.633381 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bjtxg"] Mar 17 02:58:09 crc kubenswrapper[4735]: E0317 02:58:09.634643 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79373054-b0ae-47ba-a0f4-eb65a20d565b" containerName="oc" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.634664 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="79373054-b0ae-47ba-a0f4-eb65a20d565b" containerName="oc" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.634921 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="79373054-b0ae-47ba-a0f4-eb65a20d565b" containerName="oc" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.636742 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.648086 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjtxg"] Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.773063 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc99d\" (UniqueName: \"kubernetes.io/projected/a34404a1-eec7-4ce3-8417-abc336cef646-kube-api-access-tc99d\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.773116 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-utilities\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.773285 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-catalog-content\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " 
pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.875318 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-catalog-content\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.875635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc99d\" (UniqueName: \"kubernetes.io/projected/a34404a1-eec7-4ce3-8417-abc336cef646-kube-api-access-tc99d\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.875676 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-utilities\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.876270 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-catalog-content\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.876306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-utilities\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc 
kubenswrapper[4735]: I0317 02:58:09.913369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc99d\" (UniqueName: \"kubernetes.io/projected/a34404a1-eec7-4ce3-8417-abc336cef646-kube-api-access-tc99d\") pod \"redhat-operators-bjtxg\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:09 crc kubenswrapper[4735]: I0317 02:58:09.989235 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:10 crc kubenswrapper[4735]: I0317 02:58:10.489932 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjtxg"] Mar 17 02:58:10 crc kubenswrapper[4735]: I0317 02:58:10.513794 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerStarted","Data":"aacd69f63df7200f21d4b04a4235da2e8442fde7fd9b67003e994fed462faeba"} Mar 17 02:58:11 crc kubenswrapper[4735]: I0317 02:58:11.538953 4735 generic.go:334] "Generic (PLEG): container finished" podID="a34404a1-eec7-4ce3-8417-abc336cef646" containerID="613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6" exitCode=0 Mar 17 02:58:11 crc kubenswrapper[4735]: I0317 02:58:11.539030 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerDied","Data":"613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6"} Mar 17 02:58:12 crc kubenswrapper[4735]: I0317 02:58:12.606068 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:58:12 
crc kubenswrapper[4735]: I0317 02:58:12.606361 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:58:13 crc kubenswrapper[4735]: I0317 02:58:13.706901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerStarted","Data":"c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae"} Mar 17 02:58:18 crc kubenswrapper[4735]: I0317 02:58:18.766931 4735 generic.go:334] "Generic (PLEG): container finished" podID="a34404a1-eec7-4ce3-8417-abc336cef646" containerID="c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae" exitCode=0 Mar 17 02:58:18 crc kubenswrapper[4735]: I0317 02:58:18.767035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerDied","Data":"c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae"} Mar 17 02:58:19 crc kubenswrapper[4735]: I0317 02:58:19.787965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerStarted","Data":"8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156"} Mar 17 02:58:19 crc kubenswrapper[4735]: I0317 02:58:19.818094 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bjtxg" podStartSLOduration=3.197266948 podStartE2EDuration="10.818068748s" podCreationTimestamp="2026-03-17 02:58:09 +0000 UTC" firstStartedPulling="2026-03-17 02:58:11.543421037 +0000 UTC m=+6517.175654025" 
lastFinishedPulling="2026-03-17 02:58:19.164222807 +0000 UTC m=+6524.796455825" observedRunningTime="2026-03-17 02:58:19.812835172 +0000 UTC m=+6525.445068170" watchObservedRunningTime="2026-03-17 02:58:19.818068748 +0000 UTC m=+6525.450301746" Mar 17 02:58:19 crc kubenswrapper[4735]: I0317 02:58:19.989738 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:19 crc kubenswrapper[4735]: I0317 02:58:19.989787 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:58:21 crc kubenswrapper[4735]: I0317 02:58:21.060657 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjtxg" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" probeResult="failure" output=< Mar 17 02:58:21 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:58:21 crc kubenswrapper[4735]: > Mar 17 02:58:31 crc kubenswrapper[4735]: I0317 02:58:31.042304 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjtxg" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" probeResult="failure" output=< Mar 17 02:58:31 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:58:31 crc kubenswrapper[4735]: > Mar 17 02:58:41 crc kubenswrapper[4735]: I0317 02:58:41.054877 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjtxg" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" probeResult="failure" output=< Mar 17 02:58:41 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:58:41 crc kubenswrapper[4735]: > Mar 17 02:58:42 crc kubenswrapper[4735]: I0317 02:58:42.606194 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:58:42 crc kubenswrapper[4735]: I0317 02:58:42.606606 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:58:42 crc kubenswrapper[4735]: I0317 02:58:42.606674 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 02:58:42 crc kubenswrapper[4735]: I0317 02:58:42.607787 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:58:42 crc kubenswrapper[4735]: I0317 02:58:42.607937 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" gracePeriod=600 Mar 17 02:58:42 crc kubenswrapper[4735]: E0317 02:58:42.732827 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:58:43 crc kubenswrapper[4735]: I0317 02:58:43.047129 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" exitCode=0 Mar 17 02:58:43 crc kubenswrapper[4735]: I0317 02:58:43.047189 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891"} Mar 17 02:58:43 crc kubenswrapper[4735]: I0317 02:58:43.047268 4735 scope.go:117] "RemoveContainer" containerID="6336d8673290b23c0852972ac6ab5890a1b7cdbc464ec54e27ad8a56080a4dc6" Mar 17 02:58:43 crc kubenswrapper[4735]: I0317 02:58:43.048849 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 02:58:43 crc kubenswrapper[4735]: E0317 02:58:43.049640 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:58:51 crc kubenswrapper[4735]: I0317 02:58:51.066136 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjtxg" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" probeResult="failure" output=< Mar 17 02:58:51 crc kubenswrapper[4735]: timeout: failed 
to connect service ":50051" within 1s Mar 17 02:58:51 crc kubenswrapper[4735]: > Mar 17 02:58:57 crc kubenswrapper[4735]: I0317 02:58:57.073765 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 02:58:57 crc kubenswrapper[4735]: E0317 02:58:57.075206 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:59:00 crc kubenswrapper[4735]: I0317 02:59:00.049665 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:59:00 crc kubenswrapper[4735]: I0317 02:59:00.118980 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:59:00 crc kubenswrapper[4735]: I0317 02:59:00.308660 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjtxg"] Mar 17 02:59:01 crc kubenswrapper[4735]: I0317 02:59:01.247718 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bjtxg" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" containerID="cri-o://8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156" gracePeriod=2 Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.064885 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.249990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-catalog-content\") pod \"a34404a1-eec7-4ce3-8417-abc336cef646\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.250059 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-utilities\") pod \"a34404a1-eec7-4ce3-8417-abc336cef646\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.250253 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc99d\" (UniqueName: \"kubernetes.io/projected/a34404a1-eec7-4ce3-8417-abc336cef646-kube-api-access-tc99d\") pod \"a34404a1-eec7-4ce3-8417-abc336cef646\" (UID: \"a34404a1-eec7-4ce3-8417-abc336cef646\") " Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.251067 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-utilities" (OuterVolumeSpecName: "utilities") pod "a34404a1-eec7-4ce3-8417-abc336cef646" (UID: "a34404a1-eec7-4ce3-8417-abc336cef646"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.275321 4735 generic.go:334] "Generic (PLEG): container finished" podID="a34404a1-eec7-4ce3-8417-abc336cef646" containerID="8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156" exitCode=0 Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.275377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerDied","Data":"8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156"} Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.275415 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjtxg" event={"ID":"a34404a1-eec7-4ce3-8417-abc336cef646","Type":"ContainerDied","Data":"aacd69f63df7200f21d4b04a4235da2e8442fde7fd9b67003e994fed462faeba"} Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.275436 4735 scope.go:117] "RemoveContainer" containerID="8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.275555 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjtxg" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.277320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34404a1-eec7-4ce3-8417-abc336cef646-kube-api-access-tc99d" (OuterVolumeSpecName: "kube-api-access-tc99d") pod "a34404a1-eec7-4ce3-8417-abc336cef646" (UID: "a34404a1-eec7-4ce3-8417-abc336cef646"). InnerVolumeSpecName "kube-api-access-tc99d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.330037 4735 scope.go:117] "RemoveContainer" containerID="c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.352969 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.353006 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc99d\" (UniqueName: \"kubernetes.io/projected/a34404a1-eec7-4ce3-8417-abc336cef646-kube-api-access-tc99d\") on node \"crc\" DevicePath \"\"" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.353617 4735 scope.go:117] "RemoveContainer" containerID="613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.400656 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a34404a1-eec7-4ce3-8417-abc336cef646" (UID: "a34404a1-eec7-4ce3-8417-abc336cef646"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.424753 4735 scope.go:117] "RemoveContainer" containerID="8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156" Mar 17 02:59:02 crc kubenswrapper[4735]: E0317 02:59:02.425398 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156\": container with ID starting with 8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156 not found: ID does not exist" containerID="8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.425545 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156"} err="failed to get container status \"8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156\": rpc error: code = NotFound desc = could not find container \"8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156\": container with ID starting with 8908210d261ff0bbaec1a6dc16815d899033e41c65b13faa84ba9b054b669156 not found: ID does not exist" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.425576 4735 scope.go:117] "RemoveContainer" containerID="c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae" Mar 17 02:59:02 crc kubenswrapper[4735]: E0317 02:59:02.426564 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae\": container with ID starting with c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae not found: ID does not exist" containerID="c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.426629 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae"} err="failed to get container status \"c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae\": rpc error: code = NotFound desc = could not find container \"c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae\": container with ID starting with c0474c2e8bf25441184b23a53e547362076ed5bc653aa61f6ed67e5b230bf3ae not found: ID does not exist" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.426665 4735 scope.go:117] "RemoveContainer" containerID="613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6" Mar 17 02:59:02 crc kubenswrapper[4735]: E0317 02:59:02.427130 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6\": container with ID starting with 613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6 not found: ID does not exist" containerID="613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.427172 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6"} err="failed to get container status \"613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6\": rpc error: code = NotFound desc = could not find container \"613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6\": container with ID starting with 613b14a4dc6bee8360fa6efe13ab51b435609a4943381882372e8f1340ea9ee6 not found: ID does not exist" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.454290 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a34404a1-eec7-4ce3-8417-abc336cef646-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.622770 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjtxg"] Mar 17 02:59:02 crc kubenswrapper[4735]: I0317 02:59:02.632741 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bjtxg"] Mar 17 02:59:03 crc kubenswrapper[4735]: I0317 02:59:03.098391 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" path="/var/lib/kubelet/pods/a34404a1-eec7-4ce3-8417-abc336cef646/volumes" Mar 17 02:59:04 crc kubenswrapper[4735]: I0317 02:59:04.847751 4735 scope.go:117] "RemoveContainer" containerID="e1f949776bfa58b3bf940c70e662a5a9a424f8770ab10c9e24d98fa7d21021dd" Mar 17 02:59:11 crc kubenswrapper[4735]: I0317 02:59:11.076389 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 02:59:11 crc kubenswrapper[4735]: E0317 02:59:11.079624 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:59:24 crc kubenswrapper[4735]: I0317 02:59:24.073077 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 02:59:24 crc kubenswrapper[4735]: E0317 02:59:24.073849 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.334710 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 02:59:31 crc kubenswrapper[4735]: E0317 02:59:31.335519 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="extract-utilities" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.335532 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="extract-utilities" Mar 17 02:59:31 crc kubenswrapper[4735]: E0317 02:59:31.335548 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.335554 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" Mar 17 02:59:31 crc kubenswrapper[4735]: E0317 02:59:31.335587 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="extract-content" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.335594 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="extract-content" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.335774 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34404a1-eec7-4ce3-8417-abc336cef646" containerName="registry-server" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.337059 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.347265 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.475930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlm29\" (UniqueName: \"kubernetes.io/projected/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-kube-api-access-xlm29\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.476005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-utilities\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.476038 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-catalog-content\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.577711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlm29\" (UniqueName: \"kubernetes.io/projected/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-kube-api-access-xlm29\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.577830 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-utilities\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.577901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-catalog-content\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.578461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-utilities\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.578611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-catalog-content\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.614004 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlm29\" (UniqueName: \"kubernetes.io/projected/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-kube-api-access-xlm29\") pod \"community-operators-jxvlc\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:31 crc kubenswrapper[4735]: I0317 02:59:31.730468 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:32 crc kubenswrapper[4735]: I0317 02:59:32.271269 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 02:59:32 crc kubenswrapper[4735]: I0317 02:59:32.633677 4735 generic.go:334] "Generic (PLEG): container finished" podID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerID="3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830" exitCode=0 Mar 17 02:59:32 crc kubenswrapper[4735]: I0317 02:59:32.633744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerDied","Data":"3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830"} Mar 17 02:59:32 crc kubenswrapper[4735]: I0317 02:59:32.634089 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerStarted","Data":"8b580c67c2de3bddfcc304dd4792aa80aba4dcb300bcf59439fb7c840e33a5a9"} Mar 17 02:59:37 crc kubenswrapper[4735]: I0317 02:59:37.682260 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerStarted","Data":"b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7"} Mar 17 02:59:39 crc kubenswrapper[4735]: I0317 02:59:39.075772 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 02:59:39 crc kubenswrapper[4735]: E0317 02:59:39.076127 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:59:39 crc kubenswrapper[4735]: I0317 02:59:39.716345 4735 generic.go:334] "Generic (PLEG): container finished" podID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerID="b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7" exitCode=0 Mar 17 02:59:39 crc kubenswrapper[4735]: I0317 02:59:39.716419 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerDied","Data":"b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7"} Mar 17 02:59:40 crc kubenswrapper[4735]: I0317 02:59:40.730171 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerStarted","Data":"af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892"} Mar 17 02:59:40 crc kubenswrapper[4735]: I0317 02:59:40.767071 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jxvlc" podStartSLOduration=2.20934653 podStartE2EDuration="9.767046952s" podCreationTimestamp="2026-03-17 02:59:31 +0000 UTC" firstStartedPulling="2026-03-17 02:59:32.636710333 +0000 UTC m=+6598.268943341" lastFinishedPulling="2026-03-17 02:59:40.194410795 +0000 UTC m=+6605.826643763" observedRunningTime="2026-03-17 02:59:40.755800182 +0000 UTC m=+6606.388033170" watchObservedRunningTime="2026-03-17 02:59:40.767046952 +0000 UTC m=+6606.399279930" Mar 17 02:59:41 crc kubenswrapper[4735]: I0317 02:59:41.731049 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:41 crc kubenswrapper[4735]: I0317 
02:59:41.731136 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:42 crc kubenswrapper[4735]: I0317 02:59:42.798664 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jxvlc" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="registry-server" probeResult="failure" output=< Mar 17 02:59:42 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 02:59:42 crc kubenswrapper[4735]: > Mar 17 02:59:51 crc kubenswrapper[4735]: I0317 02:59:51.074043 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 02:59:51 crc kubenswrapper[4735]: E0317 02:59:51.075203 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 02:59:51 crc kubenswrapper[4735]: I0317 02:59:51.832358 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:51 crc kubenswrapper[4735]: I0317 02:59:51.917647 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.021371 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.094500 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:59:52 crc 
kubenswrapper[4735]: I0317 02:59:52.094832 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7phjv" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="registry-server" containerID="cri-o://d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77" gracePeriod=2 Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.807018 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.877677 4735 generic.go:334] "Generic (PLEG): container finished" podID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerID="d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77" exitCode=0 Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.879165 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7phjv" Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.880186 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerDied","Data":"d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77"} Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.882341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7phjv" event={"ID":"4dd837c7-abaf-4b93-a871-7f9a370120c9","Type":"ContainerDied","Data":"7d16fd54c985cc2af3f16379373cb4393090f7adbbe1ad22618d6fef9a5bf3a5"} Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.882373 4735 scope.go:117] "RemoveContainer" containerID="d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77" Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.942040 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-utilities\") pod \"4dd837c7-abaf-4b93-a871-7f9a370120c9\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.942126 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvhm4\" (UniqueName: \"kubernetes.io/projected/4dd837c7-abaf-4b93-a871-7f9a370120c9-kube-api-access-vvhm4\") pod \"4dd837c7-abaf-4b93-a871-7f9a370120c9\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.942174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-catalog-content\") pod \"4dd837c7-abaf-4b93-a871-7f9a370120c9\" (UID: \"4dd837c7-abaf-4b93-a871-7f9a370120c9\") " Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.942834 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-utilities" (OuterVolumeSpecName: "utilities") pod "4dd837c7-abaf-4b93-a871-7f9a370120c9" (UID: "4dd837c7-abaf-4b93-a871-7f9a370120c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.959066 4735 scope.go:117] "RemoveContainer" containerID="77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf" Mar 17 02:59:52 crc kubenswrapper[4735]: I0317 02:59:52.959831 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd837c7-abaf-4b93-a871-7f9a370120c9-kube-api-access-vvhm4" (OuterVolumeSpecName: "kube-api-access-vvhm4") pod "4dd837c7-abaf-4b93-a871-7f9a370120c9" (UID: "4dd837c7-abaf-4b93-a871-7f9a370120c9"). InnerVolumeSpecName "kube-api-access-vvhm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.015077 4735 scope.go:117] "RemoveContainer" containerID="7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.031591 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dd837c7-abaf-4b93-a871-7f9a370120c9" (UID: "4dd837c7-abaf-4b93-a871-7f9a370120c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.045267 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvhm4\" (UniqueName: \"kubernetes.io/projected/4dd837c7-abaf-4b93-a871-7f9a370120c9-kube-api-access-vvhm4\") on node \"crc\" DevicePath \"\"" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.045299 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.045308 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd837c7-abaf-4b93-a871-7f9a370120c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.080595 4735 scope.go:117] "RemoveContainer" containerID="d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77" Mar 17 02:59:53 crc kubenswrapper[4735]: E0317 02:59:53.081465 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77\": container with ID starting with 
d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77 not found: ID does not exist" containerID="d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.081492 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77"} err="failed to get container status \"d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77\": rpc error: code = NotFound desc = could not find container \"d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77\": container with ID starting with d415a1e927276ca841da75cbe29a125dac4dfb08b3384fa9400095b561fa4e77 not found: ID does not exist" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.081509 4735 scope.go:117] "RemoveContainer" containerID="77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf" Mar 17 02:59:53 crc kubenswrapper[4735]: E0317 02:59:53.081917 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf\": container with ID starting with 77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf not found: ID does not exist" containerID="77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.081942 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf"} err="failed to get container status \"77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf\": rpc error: code = NotFound desc = could not find container \"77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf\": container with ID starting with 77a920a1e0748c53e8b3219f6802be299789f470b264b960422a8cbe9ab27caf not found: ID does not 
exist" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.081956 4735 scope.go:117] "RemoveContainer" containerID="7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9" Mar 17 02:59:53 crc kubenswrapper[4735]: E0317 02:59:53.082402 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9\": container with ID starting with 7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9 not found: ID does not exist" containerID="7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.082422 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9"} err="failed to get container status \"7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9\": rpc error: code = NotFound desc = could not find container \"7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9\": container with ID starting with 7ac6af871961bc8633a8ad5f1b737482d927aa1f190210d84e7c642b70909de9 not found: ID does not exist" Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.206121 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:59:53 crc kubenswrapper[4735]: I0317 02:59:53.213370 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7phjv"] Mar 17 02:59:55 crc kubenswrapper[4735]: I0317 02:59:55.087382 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" path="/var/lib/kubelet/pods/4dd837c7-abaf-4b93-a871-7f9a370120c9/volumes" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.192913 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29561940-b4kfz"] Mar 17 03:00:00 crc kubenswrapper[4735]: E0317 03:00:00.193851 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="registry-server" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.193880 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="registry-server" Mar 17 03:00:00 crc kubenswrapper[4735]: E0317 03:00:00.193910 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="extract-utilities" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.193916 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="extract-utilities" Mar 17 03:00:00 crc kubenswrapper[4735]: E0317 03:00:00.193932 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="extract-content" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.193941 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="extract-content" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.194168 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd837c7-abaf-4b93-a871-7f9a370120c9" containerName="registry-server" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.194797 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.202835 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499"] Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.204093 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.213071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561940-b4kfz"] Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.217518 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.217536 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.220040 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.220103 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.223345 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.243720 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499"] Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.292570 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4kr\" (UniqueName: \"kubernetes.io/projected/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-kube-api-access-sl4kr\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.292665 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-secret-volume\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.292713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-config-volume\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.292772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvbg\" (UniqueName: \"kubernetes.io/projected/98065f64-473d-47dd-b1e4-02cdf53a3fc0-kube-api-access-bhvbg\") pod \"auto-csr-approver-29561940-b4kfz\" (UID: \"98065f64-473d-47dd-b1e4-02cdf53a3fc0\") " pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.394118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4kr\" (UniqueName: \"kubernetes.io/projected/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-kube-api-access-sl4kr\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.394225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-secret-volume\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.394270 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-config-volume\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.394328 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvbg\" (UniqueName: \"kubernetes.io/projected/98065f64-473d-47dd-b1e4-02cdf53a3fc0-kube-api-access-bhvbg\") pod \"auto-csr-approver-29561940-b4kfz\" (UID: \"98065f64-473d-47dd-b1e4-02cdf53a3fc0\") " pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.395967 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-config-volume\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.407107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-secret-volume\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.409783 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvbg\" (UniqueName: \"kubernetes.io/projected/98065f64-473d-47dd-b1e4-02cdf53a3fc0-kube-api-access-bhvbg\") pod 
\"auto-csr-approver-29561940-b4kfz\" (UID: \"98065f64-473d-47dd-b1e4-02cdf53a3fc0\") " pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.412105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4kr\" (UniqueName: \"kubernetes.io/projected/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-kube-api-access-sl4kr\") pod \"collect-profiles-29561940-fh499\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.513450 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:00 crc kubenswrapper[4735]: I0317 03:00:00.523501 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:01 crc kubenswrapper[4735]: I0317 03:00:01.410815 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499"] Mar 17 03:00:01 crc kubenswrapper[4735]: W0317 03:00:01.415041 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d58c75f_83cc_4cc0_8a39_8f58353bbd0e.slice/crio-7435066c7ac9404ecffe809a7613fabd92c3c15933ff5faad4b3750cfcb6daf9 WatchSource:0}: Error finding container 7435066c7ac9404ecffe809a7613fabd92c3c15933ff5faad4b3750cfcb6daf9: Status 404 returned error can't find the container with id 7435066c7ac9404ecffe809a7613fabd92c3c15933ff5faad4b3750cfcb6daf9 Mar 17 03:00:01 crc kubenswrapper[4735]: I0317 03:00:01.476933 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561940-b4kfz"] Mar 17 03:00:01 crc kubenswrapper[4735]: W0317 03:00:01.485106 4735 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98065f64_473d_47dd_b1e4_02cdf53a3fc0.slice/crio-17773b310197be41b102b2ee1d3c62ee666b9cb754540ca4807171eeb4c7e506 WatchSource:0}: Error finding container 17773b310197be41b102b2ee1d3c62ee666b9cb754540ca4807171eeb4c7e506: Status 404 returned error can't find the container with id 17773b310197be41b102b2ee1d3c62ee666b9cb754540ca4807171eeb4c7e506 Mar 17 03:00:01 crc kubenswrapper[4735]: I0317 03:00:01.971396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" event={"ID":"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e","Type":"ContainerStarted","Data":"3dc99e2b4d331318bddc8f10fcf4261b7e8faed32ca6f024442a00de13af397c"} Mar 17 03:00:01 crc kubenswrapper[4735]: I0317 03:00:01.971438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" event={"ID":"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e","Type":"ContainerStarted","Data":"7435066c7ac9404ecffe809a7613fabd92c3c15933ff5faad4b3750cfcb6daf9"} Mar 17 03:00:01 crc kubenswrapper[4735]: I0317 03:00:01.973427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" event={"ID":"98065f64-473d-47dd-b1e4-02cdf53a3fc0","Type":"ContainerStarted","Data":"17773b310197be41b102b2ee1d3c62ee666b9cb754540ca4807171eeb4c7e506"} Mar 17 03:00:01 crc kubenswrapper[4735]: I0317 03:00:01.990952 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" podStartSLOduration=1.990934365 podStartE2EDuration="1.990934365s" podCreationTimestamp="2026-03-17 03:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 03:00:01.987568194 +0000 UTC m=+6627.619801162" watchObservedRunningTime="2026-03-17 
03:00:01.990934365 +0000 UTC m=+6627.623167343" Mar 17 03:00:02 crc kubenswrapper[4735]: I0317 03:00:02.073472 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:00:02 crc kubenswrapper[4735]: E0317 03:00:02.073761 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:00:02 crc kubenswrapper[4735]: I0317 03:00:02.981380 4735 generic.go:334] "Generic (PLEG): container finished" podID="9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" containerID="3dc99e2b4d331318bddc8f10fcf4261b7e8faed32ca6f024442a00de13af397c" exitCode=0 Mar 17 03:00:02 crc kubenswrapper[4735]: I0317 03:00:02.981422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" event={"ID":"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e","Type":"ContainerDied","Data":"3dc99e2b4d331318bddc8f10fcf4261b7e8faed32ca6f024442a00de13af397c"} Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.361037 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.475986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-secret-volume\") pod \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.476270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl4kr\" (UniqueName: \"kubernetes.io/projected/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-kube-api-access-sl4kr\") pod \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.477297 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-config-volume\") pod \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\" (UID: \"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e\") " Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.483594 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-config-volume" (OuterVolumeSpecName: "config-volume") pod "9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" (UID: "9d58c75f-83cc-4cc0-8a39-8f58353bbd0e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.494870 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd"] Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.504029 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-kube-api-access-sl4kr" (OuterVolumeSpecName: "kube-api-access-sl4kr") pod "9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" (UID: "9d58c75f-83cc-4cc0-8a39-8f58353bbd0e"). InnerVolumeSpecName "kube-api-access-sl4kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.504961 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-mdjwd"] Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.508665 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" (UID: "9d58c75f-83cc-4cc0-8a39-8f58353bbd0e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.580285 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.580519 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:00:04 crc kubenswrapper[4735]: I0317 03:00:04.580605 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl4kr\" (UniqueName: \"kubernetes.io/projected/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e-kube-api-access-sl4kr\") on node \"crc\" DevicePath \"\"" Mar 17 03:00:05 crc kubenswrapper[4735]: I0317 03:00:05.009546 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" event={"ID":"9d58c75f-83cc-4cc0-8a39-8f58353bbd0e","Type":"ContainerDied","Data":"7435066c7ac9404ecffe809a7613fabd92c3c15933ff5faad4b3750cfcb6daf9"} Mar 17 03:00:05 crc kubenswrapper[4735]: I0317 03:00:05.009719 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7435066c7ac9404ecffe809a7613fabd92c3c15933ff5faad4b3750cfcb6daf9" Mar 17 03:00:05 crc kubenswrapper[4735]: I0317 03:00:05.009843 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499" Mar 17 03:00:05 crc kubenswrapper[4735]: I0317 03:00:05.091199 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be5fc0c-e7b0-45cd-9fce-ed5dde74a261" path="/var/lib/kubelet/pods/3be5fc0c-e7b0-45cd-9fce-ed5dde74a261/volumes" Mar 17 03:00:16 crc kubenswrapper[4735]: I0317 03:00:16.073428 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:00:16 crc kubenswrapper[4735]: E0317 03:00:16.074586 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:00:23 crc kubenswrapper[4735]: I0317 03:00:23.236754 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" event={"ID":"98065f64-473d-47dd-b1e4-02cdf53a3fc0","Type":"ContainerStarted","Data":"cf68af1a0cd5ecc74a86674359a1ee1f4d54729a7654cbc575bfc9ee5db98754"} Mar 17 03:00:23 crc kubenswrapper[4735]: I0317 03:00:23.269041 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" podStartSLOduration=2.225821407 podStartE2EDuration="23.269019579s" podCreationTimestamp="2026-03-17 03:00:00 +0000 UTC" firstStartedPulling="2026-03-17 03:00:01.488281942 +0000 UTC m=+6627.120514920" lastFinishedPulling="2026-03-17 03:00:22.531480114 +0000 UTC m=+6648.163713092" observedRunningTime="2026-03-17 03:00:23.257349089 +0000 UTC m=+6648.889582057" watchObservedRunningTime="2026-03-17 03:00:23.269019579 +0000 UTC m=+6648.901252567" Mar 17 03:00:25 
crc kubenswrapper[4735]: I0317 03:00:25.263013 4735 generic.go:334] "Generic (PLEG): container finished" podID="98065f64-473d-47dd-b1e4-02cdf53a3fc0" containerID="cf68af1a0cd5ecc74a86674359a1ee1f4d54729a7654cbc575bfc9ee5db98754" exitCode=0 Mar 17 03:00:25 crc kubenswrapper[4735]: I0317 03:00:25.263116 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" event={"ID":"98065f64-473d-47dd-b1e4-02cdf53a3fc0","Type":"ContainerDied","Data":"cf68af1a0cd5ecc74a86674359a1ee1f4d54729a7654cbc575bfc9ee5db98754"} Mar 17 03:00:26 crc kubenswrapper[4735]: I0317 03:00:26.713358 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:26 crc kubenswrapper[4735]: I0317 03:00:26.783309 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhvbg\" (UniqueName: \"kubernetes.io/projected/98065f64-473d-47dd-b1e4-02cdf53a3fc0-kube-api-access-bhvbg\") pod \"98065f64-473d-47dd-b1e4-02cdf53a3fc0\" (UID: \"98065f64-473d-47dd-b1e4-02cdf53a3fc0\") " Mar 17 03:00:26 crc kubenswrapper[4735]: I0317 03:00:26.788817 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98065f64-473d-47dd-b1e4-02cdf53a3fc0-kube-api-access-bhvbg" (OuterVolumeSpecName: "kube-api-access-bhvbg") pod "98065f64-473d-47dd-b1e4-02cdf53a3fc0" (UID: "98065f64-473d-47dd-b1e4-02cdf53a3fc0"). InnerVolumeSpecName "kube-api-access-bhvbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:00:26 crc kubenswrapper[4735]: I0317 03:00:26.886271 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhvbg\" (UniqueName: \"kubernetes.io/projected/98065f64-473d-47dd-b1e4-02cdf53a3fc0-kube-api-access-bhvbg\") on node \"crc\" DevicePath \"\"" Mar 17 03:00:27 crc kubenswrapper[4735]: I0317 03:00:27.287304 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" event={"ID":"98065f64-473d-47dd-b1e4-02cdf53a3fc0","Type":"ContainerDied","Data":"17773b310197be41b102b2ee1d3c62ee666b9cb754540ca4807171eeb4c7e506"} Mar 17 03:00:27 crc kubenswrapper[4735]: I0317 03:00:27.287369 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17773b310197be41b102b2ee1d3c62ee666b9cb754540ca4807171eeb4c7e506" Mar 17 03:00:27 crc kubenswrapper[4735]: I0317 03:00:27.287465 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561940-b4kfz" Mar 17 03:00:27 crc kubenswrapper[4735]: I0317 03:00:27.395390 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561934-7hm7j"] Mar 17 03:00:27 crc kubenswrapper[4735]: I0317 03:00:27.416090 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561934-7hm7j"] Mar 17 03:00:28 crc kubenswrapper[4735]: I0317 03:00:28.073297 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:00:28 crc kubenswrapper[4735]: E0317 03:00:28.074048 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:00:29 crc kubenswrapper[4735]: I0317 03:00:29.089608 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a362e14-7198-4129-a8a7-99b0ae1f45eb" path="/var/lib/kubelet/pods/9a362e14-7198-4129-a8a7-99b0ae1f45eb/volumes" Mar 17 03:00:40 crc kubenswrapper[4735]: I0317 03:00:40.073313 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:00:40 crc kubenswrapper[4735]: E0317 03:00:40.074170 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:00:52 crc kubenswrapper[4735]: I0317 03:00:52.073940 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:00:52 crc kubenswrapper[4735]: E0317 03:00:52.074925 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.166913 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29561941-shsnj"] Mar 17 03:01:00 crc kubenswrapper[4735]: E0317 03:01:00.167680 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98065f64-473d-47dd-b1e4-02cdf53a3fc0" containerName="oc" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.167693 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="98065f64-473d-47dd-b1e4-02cdf53a3fc0" containerName="oc" Mar 17 03:01:00 crc kubenswrapper[4735]: E0317 03:01:00.167716 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" containerName="collect-profiles" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.167722 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" containerName="collect-profiles" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.167931 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" containerName="collect-profiles" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.167946 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="98065f64-473d-47dd-b1e4-02cdf53a3fc0" containerName="oc" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.168632 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.185624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561941-shsnj"] Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.310400 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-fernet-keys\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.310477 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-config-data\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.310526 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-combined-ca-bundle\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.310562 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47fj6\" (UniqueName: \"kubernetes.io/projected/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-kube-api-access-47fj6\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.411788 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-fernet-keys\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.412081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-config-data\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.412214 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-combined-ca-bundle\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.412321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47fj6\" (UniqueName: \"kubernetes.io/projected/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-kube-api-access-47fj6\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.419613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-config-data\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.421589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-combined-ca-bundle\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.429001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-fernet-keys\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.436784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47fj6\" (UniqueName: \"kubernetes.io/projected/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-kube-api-access-47fj6\") pod \"keystone-cron-29561941-shsnj\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.492167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:00 crc kubenswrapper[4735]: I0317 03:01:00.958490 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561941-shsnj"] Mar 17 03:01:01 crc kubenswrapper[4735]: I0317 03:01:01.633543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561941-shsnj" event={"ID":"fde14ac4-f83c-412e-ba07-d1ce0e8368d6","Type":"ContainerStarted","Data":"cc613128f3f627dd0681bf1b9175b3252a87a335d97669370a9da8a4aa48782a"} Mar 17 03:01:01 crc kubenswrapper[4735]: I0317 03:01:01.633886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561941-shsnj" event={"ID":"fde14ac4-f83c-412e-ba07-d1ce0e8368d6","Type":"ContainerStarted","Data":"fc3c9b70bb371443ffd7a4b0fab2428acfaf4867990411430ca7b4a9983c0e6b"} Mar 17 03:01:01 crc kubenswrapper[4735]: I0317 03:01:01.671219 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29561941-shsnj" podStartSLOduration=1.671189719 podStartE2EDuration="1.671189719s" podCreationTimestamp="2026-03-17 03:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 03:01:01.661209019 +0000 UTC m=+6687.293441997" watchObservedRunningTime="2026-03-17 03:01:01.671189719 +0000 UTC m=+6687.303422737" Mar 17 03:01:03 crc kubenswrapper[4735]: I0317 03:01:03.074570 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:01:03 crc kubenswrapper[4735]: E0317 03:01:03.075121 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:01:05 crc kubenswrapper[4735]: I0317 03:01:05.052418 4735 scope.go:117] "RemoveContainer" containerID="24d9baa8b7d2213be9e054dd025798a89fa5b7e6ddddb725343075df14647158" Mar 17 03:01:05 crc kubenswrapper[4735]: I0317 03:01:05.103028 4735 scope.go:117] "RemoveContainer" containerID="f34d6f2c3caec0f505484a3deb500b2505d65a334c7f4d594c950618ec32c808" Mar 17 03:01:05 crc kubenswrapper[4735]: I0317 03:01:05.665322 4735 generic.go:334] "Generic (PLEG): container finished" podID="fde14ac4-f83c-412e-ba07-d1ce0e8368d6" containerID="cc613128f3f627dd0681bf1b9175b3252a87a335d97669370a9da8a4aa48782a" exitCode=0 Mar 17 03:01:05 crc kubenswrapper[4735]: I0317 03:01:05.665414 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561941-shsnj" event={"ID":"fde14ac4-f83c-412e-ba07-d1ce0e8368d6","Type":"ContainerDied","Data":"cc613128f3f627dd0681bf1b9175b3252a87a335d97669370a9da8a4aa48782a"} Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.225701 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.350632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-config-data\") pod \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.350743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47fj6\" (UniqueName: \"kubernetes.io/projected/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-kube-api-access-47fj6\") pod \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.350906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-fernet-keys\") pod \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.350940 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-combined-ca-bundle\") pod \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\" (UID: \"fde14ac4-f83c-412e-ba07-d1ce0e8368d6\") " Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.357967 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fde14ac4-f83c-412e-ba07-d1ce0e8368d6" (UID: "fde14ac4-f83c-412e-ba07-d1ce0e8368d6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.360539 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-kube-api-access-47fj6" (OuterVolumeSpecName: "kube-api-access-47fj6") pod "fde14ac4-f83c-412e-ba07-d1ce0e8368d6" (UID: "fde14ac4-f83c-412e-ba07-d1ce0e8368d6"). InnerVolumeSpecName "kube-api-access-47fj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.392076 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fde14ac4-f83c-412e-ba07-d1ce0e8368d6" (UID: "fde14ac4-f83c-412e-ba07-d1ce0e8368d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.421051 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-config-data" (OuterVolumeSpecName: "config-data") pod "fde14ac4-f83c-412e-ba07-d1ce0e8368d6" (UID: "fde14ac4-f83c-412e-ba07-d1ce0e8368d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.453698 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.453799 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47fj6\" (UniqueName: \"kubernetes.io/projected/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-kube-api-access-47fj6\") on node \"crc\" DevicePath \"\"" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.453872 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.453927 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde14ac4-f83c-412e-ba07-d1ce0e8368d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.686921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561941-shsnj" event={"ID":"fde14ac4-f83c-412e-ba07-d1ce0e8368d6","Type":"ContainerDied","Data":"fc3c9b70bb371443ffd7a4b0fab2428acfaf4867990411430ca7b4a9983c0e6b"} Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.687216 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc3c9b70bb371443ffd7a4b0fab2428acfaf4867990411430ca7b4a9983c0e6b" Mar 17 03:01:07 crc kubenswrapper[4735]: I0317 03:01:07.687154 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561941-shsnj" Mar 17 03:01:18 crc kubenswrapper[4735]: I0317 03:01:18.073241 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:01:18 crc kubenswrapper[4735]: E0317 03:01:18.074123 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:01:32 crc kubenswrapper[4735]: I0317 03:01:32.073962 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:01:32 crc kubenswrapper[4735]: E0317 03:01:32.075131 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:01:43 crc kubenswrapper[4735]: I0317 03:01:43.073581 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:01:43 crc kubenswrapper[4735]: E0317 03:01:43.074503 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:01:56 crc kubenswrapper[4735]: I0317 03:01:56.073486 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:01:56 crc kubenswrapper[4735]: E0317 03:01:56.074348 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.170630 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561942-5wgmt"] Mar 17 03:02:00 crc kubenswrapper[4735]: E0317 03:02:00.171573 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde14ac4-f83c-412e-ba07-d1ce0e8368d6" containerName="keystone-cron" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.171585 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde14ac4-f83c-412e-ba07-d1ce0e8368d6" containerName="keystone-cron" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.171793 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde14ac4-f83c-412e-ba07-d1ce0e8368d6" containerName="keystone-cron" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.172370 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.174799 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.175027 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.185281 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.192433 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561942-5wgmt"] Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.318538 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmlfl\" (UniqueName: \"kubernetes.io/projected/5617a772-5a13-4cea-a8d0-670299f82028-kube-api-access-qmlfl\") pod \"auto-csr-approver-29561942-5wgmt\" (UID: \"5617a772-5a13-4cea-a8d0-670299f82028\") " pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.420393 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmlfl\" (UniqueName: \"kubernetes.io/projected/5617a772-5a13-4cea-a8d0-670299f82028-kube-api-access-qmlfl\") pod \"auto-csr-approver-29561942-5wgmt\" (UID: \"5617a772-5a13-4cea-a8d0-670299f82028\") " pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.453528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmlfl\" (UniqueName: \"kubernetes.io/projected/5617a772-5a13-4cea-a8d0-670299f82028-kube-api-access-qmlfl\") pod \"auto-csr-approver-29561942-5wgmt\" (UID: \"5617a772-5a13-4cea-a8d0-670299f82028\") " 
pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:00 crc kubenswrapper[4735]: I0317 03:02:00.499752 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:01 crc kubenswrapper[4735]: I0317 03:02:01.012853 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561942-5wgmt"] Mar 17 03:02:01 crc kubenswrapper[4735]: I0317 03:02:01.221607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" event={"ID":"5617a772-5a13-4cea-a8d0-670299f82028","Type":"ContainerStarted","Data":"d516551dec1ace7d96450690d3311e92fe1121ece06fc172e7470aad0d4b70ef"} Mar 17 03:02:03 crc kubenswrapper[4735]: I0317 03:02:03.241570 4735 generic.go:334] "Generic (PLEG): container finished" podID="5617a772-5a13-4cea-a8d0-670299f82028" containerID="e8def1ca637278f228ab727aeaa4534fb6ffee2fc9e132729e461b28eee25b70" exitCode=0 Mar 17 03:02:03 crc kubenswrapper[4735]: I0317 03:02:03.241769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" event={"ID":"5617a772-5a13-4cea-a8d0-670299f82028","Type":"ContainerDied","Data":"e8def1ca637278f228ab727aeaa4534fb6ffee2fc9e132729e461b28eee25b70"} Mar 17 03:02:04 crc kubenswrapper[4735]: I0317 03:02:04.620396 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:04 crc kubenswrapper[4735]: I0317 03:02:04.734499 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmlfl\" (UniqueName: \"kubernetes.io/projected/5617a772-5a13-4cea-a8d0-670299f82028-kube-api-access-qmlfl\") pod \"5617a772-5a13-4cea-a8d0-670299f82028\" (UID: \"5617a772-5a13-4cea-a8d0-670299f82028\") " Mar 17 03:02:04 crc kubenswrapper[4735]: I0317 03:02:04.742063 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5617a772-5a13-4cea-a8d0-670299f82028-kube-api-access-qmlfl" (OuterVolumeSpecName: "kube-api-access-qmlfl") pod "5617a772-5a13-4cea-a8d0-670299f82028" (UID: "5617a772-5a13-4cea-a8d0-670299f82028"). InnerVolumeSpecName "kube-api-access-qmlfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:02:04 crc kubenswrapper[4735]: I0317 03:02:04.836407 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmlfl\" (UniqueName: \"kubernetes.io/projected/5617a772-5a13-4cea-a8d0-670299f82028-kube-api-access-qmlfl\") on node \"crc\" DevicePath \"\"" Mar 17 03:02:05 crc kubenswrapper[4735]: I0317 03:02:05.268491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" event={"ID":"5617a772-5a13-4cea-a8d0-670299f82028","Type":"ContainerDied","Data":"d516551dec1ace7d96450690d3311e92fe1121ece06fc172e7470aad0d4b70ef"} Mar 17 03:02:05 crc kubenswrapper[4735]: I0317 03:02:05.268552 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d516551dec1ace7d96450690d3311e92fe1121ece06fc172e7470aad0d4b70ef" Mar 17 03:02:05 crc kubenswrapper[4735]: I0317 03:02:05.268634 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561942-5wgmt" Mar 17 03:02:05 crc kubenswrapper[4735]: I0317 03:02:05.712658 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561936-cshbt"] Mar 17 03:02:05 crc kubenswrapper[4735]: I0317 03:02:05.722778 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561936-cshbt"] Mar 17 03:02:07 crc kubenswrapper[4735]: I0317 03:02:07.090738 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06" path="/var/lib/kubelet/pods/4f87fec2-4ad3-4fe1-97b9-fd26a7f69d06/volumes" Mar 17 03:02:08 crc kubenswrapper[4735]: I0317 03:02:08.073087 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:02:08 crc kubenswrapper[4735]: E0317 03:02:08.073551 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:02:23 crc kubenswrapper[4735]: I0317 03:02:23.072699 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:02:23 crc kubenswrapper[4735]: E0317 03:02:23.074538 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:02:38 crc kubenswrapper[4735]: I0317 03:02:38.074334 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:02:38 crc kubenswrapper[4735]: E0317 03:02:38.075370 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:02:50 crc kubenswrapper[4735]: I0317 03:02:50.073776 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:02:50 crc kubenswrapper[4735]: E0317 03:02:50.074609 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:03:01 crc kubenswrapper[4735]: I0317 03:03:01.073566 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:03:01 crc kubenswrapper[4735]: E0317 03:03:01.074810 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:03:05 crc kubenswrapper[4735]: I0317 03:03:05.207701 4735 scope.go:117] "RemoveContainer" containerID="330c6d7ffab67d0451139e6709bb407354a5b8d40450934bc8ffd8d129c66ae4" Mar 17 03:03:12 crc kubenswrapper[4735]: I0317 03:03:12.073653 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:03:12 crc kubenswrapper[4735]: E0317 03:03:12.074751 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:03:27 crc kubenswrapper[4735]: I0317 03:03:27.074189 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:03:27 crc kubenswrapper[4735]: E0317 03:03:27.075269 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:03:36 crc kubenswrapper[4735]: I0317 03:03:36.817339 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2rcz9"] Mar 17 03:03:36 crc kubenswrapper[4735]: E0317 03:03:36.818946 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5617a772-5a13-4cea-a8d0-670299f82028" containerName="oc" 
Mar 17 03:03:36 crc kubenswrapper[4735]: I0317 03:03:36.818981 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5617a772-5a13-4cea-a8d0-670299f82028" containerName="oc" Mar 17 03:03:36 crc kubenswrapper[4735]: I0317 03:03:36.834267 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5617a772-5a13-4cea-a8d0-670299f82028" containerName="oc" Mar 17 03:03:36 crc kubenswrapper[4735]: I0317 03:03:36.836930 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:36 crc kubenswrapper[4735]: I0317 03:03:36.846293 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rcz9"] Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.028712 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-utilities\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.028937 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-catalog-content\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.029089 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fjf\" (UniqueName: \"kubernetes.io/projected/1293c721-9574-406d-900b-44cd49edd970-kube-api-access-79fjf\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc 
kubenswrapper[4735]: I0317 03:03:37.130805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-utilities\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.130885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-catalog-content\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.130919 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fjf\" (UniqueName: \"kubernetes.io/projected/1293c721-9574-406d-900b-44cd49edd970-kube-api-access-79fjf\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.131284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-utilities\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.131363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-catalog-content\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.155824 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fjf\" (UniqueName: \"kubernetes.io/projected/1293c721-9574-406d-900b-44cd49edd970-kube-api-access-79fjf\") pod \"redhat-marketplace-2rcz9\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.192427 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:37 crc kubenswrapper[4735]: I0317 03:03:37.717639 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rcz9"] Mar 17 03:03:38 crc kubenswrapper[4735]: I0317 03:03:38.263637 4735 generic.go:334] "Generic (PLEG): container finished" podID="1293c721-9574-406d-900b-44cd49edd970" containerID="0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01" exitCode=0 Mar 17 03:03:38 crc kubenswrapper[4735]: I0317 03:03:38.263943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerDied","Data":"0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01"} Mar 17 03:03:38 crc kubenswrapper[4735]: I0317 03:03:38.263969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerStarted","Data":"b03fb40bf679270d4f303f656d38353c4d2188323d9bc594767a6a310edd8eab"} Mar 17 03:03:38 crc kubenswrapper[4735]: I0317 03:03:38.270899 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:03:40 crc kubenswrapper[4735]: I0317 03:03:40.073611 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:03:40 crc kubenswrapper[4735]: E0317 03:03:40.074714 
4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:03:40 crc kubenswrapper[4735]: I0317 03:03:40.290312 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerStarted","Data":"bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62"} Mar 17 03:03:41 crc kubenswrapper[4735]: I0317 03:03:41.300935 4735 generic.go:334] "Generic (PLEG): container finished" podID="1293c721-9574-406d-900b-44cd49edd970" containerID="bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62" exitCode=0 Mar 17 03:03:41 crc kubenswrapper[4735]: I0317 03:03:41.300987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerDied","Data":"bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62"} Mar 17 03:03:42 crc kubenswrapper[4735]: I0317 03:03:42.311649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerStarted","Data":"0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d"} Mar 17 03:03:42 crc kubenswrapper[4735]: I0317 03:03:42.343530 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2rcz9" podStartSLOduration=2.903441647 podStartE2EDuration="6.343512533s" podCreationTimestamp="2026-03-17 03:03:36 +0000 UTC" firstStartedPulling="2026-03-17 
03:03:38.268167943 +0000 UTC m=+6843.900400961" lastFinishedPulling="2026-03-17 03:03:41.708238839 +0000 UTC m=+6847.340471847" observedRunningTime="2026-03-17 03:03:42.333490552 +0000 UTC m=+6847.965723530" watchObservedRunningTime="2026-03-17 03:03:42.343512533 +0000 UTC m=+6847.975745501" Mar 17 03:03:47 crc kubenswrapper[4735]: I0317 03:03:47.193935 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:47 crc kubenswrapper[4735]: I0317 03:03:47.194492 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:47 crc kubenswrapper[4735]: I0317 03:03:47.250635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:47 crc kubenswrapper[4735]: I0317 03:03:47.409434 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:47 crc kubenswrapper[4735]: I0317 03:03:47.501160 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rcz9"] Mar 17 03:03:49 crc kubenswrapper[4735]: I0317 03:03:49.383261 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2rcz9" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="registry-server" containerID="cri-o://0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d" gracePeriod=2 Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.041105 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.207887 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-utilities\") pod \"1293c721-9574-406d-900b-44cd49edd970\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.208292 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fjf\" (UniqueName: \"kubernetes.io/projected/1293c721-9574-406d-900b-44cd49edd970-kube-api-access-79fjf\") pod \"1293c721-9574-406d-900b-44cd49edd970\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.208544 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-utilities" (OuterVolumeSpecName: "utilities") pod "1293c721-9574-406d-900b-44cd49edd970" (UID: "1293c721-9574-406d-900b-44cd49edd970"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.208891 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-catalog-content\") pod \"1293c721-9574-406d-900b-44cd49edd970\" (UID: \"1293c721-9574-406d-900b-44cd49edd970\") " Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.212728 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1293c721-9574-406d-900b-44cd49edd970-kube-api-access-79fjf" (OuterVolumeSpecName: "kube-api-access-79fjf") pod "1293c721-9574-406d-900b-44cd49edd970" (UID: "1293c721-9574-406d-900b-44cd49edd970"). InnerVolumeSpecName "kube-api-access-79fjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.224826 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fjf\" (UniqueName: \"kubernetes.io/projected/1293c721-9574-406d-900b-44cd49edd970-kube-api-access-79fjf\") on node \"crc\" DevicePath \"\"" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.224849 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.252224 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1293c721-9574-406d-900b-44cd49edd970" (UID: "1293c721-9574-406d-900b-44cd49edd970"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.326946 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1293c721-9574-406d-900b-44cd49edd970-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.396424 4735 generic.go:334] "Generic (PLEG): container finished" podID="1293c721-9574-406d-900b-44cd49edd970" containerID="0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d" exitCode=0 Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.396484 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerDied","Data":"0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d"} Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.396522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2rcz9" event={"ID":"1293c721-9574-406d-900b-44cd49edd970","Type":"ContainerDied","Data":"b03fb40bf679270d4f303f656d38353c4d2188323d9bc594767a6a310edd8eab"} Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.396550 4735 scope.go:117] "RemoveContainer" containerID="0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.396720 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rcz9" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.439518 4735 scope.go:117] "RemoveContainer" containerID="bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.470287 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rcz9"] Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.480904 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rcz9"] Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.494796 4735 scope.go:117] "RemoveContainer" containerID="0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.541168 4735 scope.go:117] "RemoveContainer" containerID="0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d" Mar 17 03:03:50 crc kubenswrapper[4735]: E0317 03:03:50.541878 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d\": container with ID starting with 0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d not found: ID does not exist" containerID="0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.541922 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d"} err="failed to get container status \"0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d\": rpc error: code = NotFound desc = could not find container \"0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d\": container with ID starting with 0eebd39c3a70ef89e84a3eed12ef42f82b214e71cac14813dce08d29a9ce0b4d not found: ID does not exist" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.541947 4735 scope.go:117] "RemoveContainer" containerID="bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62" Mar 17 03:03:50 crc kubenswrapper[4735]: E0317 03:03:50.542356 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62\": container with ID starting with bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62 not found: ID does not exist" containerID="bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.542443 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62"} err="failed to get container status \"bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62\": rpc error: code = NotFound desc = could not find container \"bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62\": container with ID starting with bbb3cdb0e63e13d6e307403ae4ff6e8fa22280a770e22429bf10e720d0402e62 not found: ID does not exist" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.542512 4735 scope.go:117] "RemoveContainer" containerID="0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01" Mar 17 03:03:50 crc kubenswrapper[4735]: E0317 
03:03:50.542943 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01\": container with ID starting with 0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01 not found: ID does not exist" containerID="0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01" Mar 17 03:03:50 crc kubenswrapper[4735]: I0317 03:03:50.542971 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01"} err="failed to get container status \"0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01\": rpc error: code = NotFound desc = could not find container \"0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01\": container with ID starting with 0d73f19e5bd33f0afc8fbfd94dca8b5614765b02ef7bda338e4746944078ec01 not found: ID does not exist" Mar 17 03:03:51 crc kubenswrapper[4735]: I0317 03:03:51.089185 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1293c721-9574-406d-900b-44cd49edd970" path="/var/lib/kubelet/pods/1293c721-9574-406d-900b-44cd49edd970/volumes" Mar 17 03:03:52 crc kubenswrapper[4735]: I0317 03:03:52.074333 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:03:52 crc kubenswrapper[4735]: I0317 03:03:52.417786 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"b0e4629fb1d1a1d92d5ca81b95949500fbbb3e9432ec413c34d1a632ff7dcf1f"} Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.143468 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561944-lknlf"] Mar 17 03:04:00 crc kubenswrapper[4735]: E0317 
03:04:00.144412 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="extract-content" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.144424 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="extract-content" Mar 17 03:04:00 crc kubenswrapper[4735]: E0317 03:04:00.144435 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="registry-server" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.144441 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="registry-server" Mar 17 03:04:00 crc kubenswrapper[4735]: E0317 03:04:00.144491 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="extract-utilities" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.144498 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="extract-utilities" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.144672 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1293c721-9574-406d-900b-44cd49edd970" containerName="registry-server" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.145327 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.155214 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.155212 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.160490 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.161718 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561944-lknlf"] Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.172391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qcbb\" (UniqueName: \"kubernetes.io/projected/70039033-a219-4b40-a247-c3b668ab9404-kube-api-access-8qcbb\") pod \"auto-csr-approver-29561944-lknlf\" (UID: \"70039033-a219-4b40-a247-c3b668ab9404\") " pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.273741 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qcbb\" (UniqueName: \"kubernetes.io/projected/70039033-a219-4b40-a247-c3b668ab9404-kube-api-access-8qcbb\") pod \"auto-csr-approver-29561944-lknlf\" (UID: \"70039033-a219-4b40-a247-c3b668ab9404\") " pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.294377 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qcbb\" (UniqueName: \"kubernetes.io/projected/70039033-a219-4b40-a247-c3b668ab9404-kube-api-access-8qcbb\") pod \"auto-csr-approver-29561944-lknlf\" (UID: \"70039033-a219-4b40-a247-c3b668ab9404\") " 
pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.472945 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:00 crc kubenswrapper[4735]: I0317 03:04:00.964523 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561944-lknlf"] Mar 17 03:04:01 crc kubenswrapper[4735]: I0317 03:04:01.527021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561944-lknlf" event={"ID":"70039033-a219-4b40-a247-c3b668ab9404","Type":"ContainerStarted","Data":"2e66b6a5c1fb4bbaa9467ec96fc9ac51213866f294d3dd0cc0ae16b7b7c51ada"} Mar 17 03:04:02 crc kubenswrapper[4735]: I0317 03:04:02.538488 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561944-lknlf" event={"ID":"70039033-a219-4b40-a247-c3b668ab9404","Type":"ContainerStarted","Data":"b570b845c2499be59d8b638871a40ef1a9127d258927789f4a5b5a3bedcf1400"} Mar 17 03:04:02 crc kubenswrapper[4735]: I0317 03:04:02.566834 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561944-lknlf" podStartSLOduration=1.450752979 podStartE2EDuration="2.566808301s" podCreationTimestamp="2026-03-17 03:04:00 +0000 UTC" firstStartedPulling="2026-03-17 03:04:00.951322014 +0000 UTC m=+6866.583554992" lastFinishedPulling="2026-03-17 03:04:02.067377306 +0000 UTC m=+6867.699610314" observedRunningTime="2026-03-17 03:04:02.559996548 +0000 UTC m=+6868.192229526" watchObservedRunningTime="2026-03-17 03:04:02.566808301 +0000 UTC m=+6868.199041289" Mar 17 03:04:03 crc kubenswrapper[4735]: I0317 03:04:03.548638 4735 generic.go:334] "Generic (PLEG): container finished" podID="70039033-a219-4b40-a247-c3b668ab9404" containerID="b570b845c2499be59d8b638871a40ef1a9127d258927789f4a5b5a3bedcf1400" exitCode=0 Mar 17 03:04:03 crc 
kubenswrapper[4735]: I0317 03:04:03.548682 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561944-lknlf" event={"ID":"70039033-a219-4b40-a247-c3b668ab9404","Type":"ContainerDied","Data":"b570b845c2499be59d8b638871a40ef1a9127d258927789f4a5b5a3bedcf1400"} Mar 17 03:04:04 crc kubenswrapper[4735]: I0317 03:04:04.986125 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.091880 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qcbb\" (UniqueName: \"kubernetes.io/projected/70039033-a219-4b40-a247-c3b668ab9404-kube-api-access-8qcbb\") pod \"70039033-a219-4b40-a247-c3b668ab9404\" (UID: \"70039033-a219-4b40-a247-c3b668ab9404\") " Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.099726 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70039033-a219-4b40-a247-c3b668ab9404-kube-api-access-8qcbb" (OuterVolumeSpecName: "kube-api-access-8qcbb") pod "70039033-a219-4b40-a247-c3b668ab9404" (UID: "70039033-a219-4b40-a247-c3b668ab9404"). InnerVolumeSpecName "kube-api-access-8qcbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.196468 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qcbb\" (UniqueName: \"kubernetes.io/projected/70039033-a219-4b40-a247-c3b668ab9404-kube-api-access-8qcbb\") on node \"crc\" DevicePath \"\"" Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.572606 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561944-lknlf" event={"ID":"70039033-a219-4b40-a247-c3b668ab9404","Type":"ContainerDied","Data":"2e66b6a5c1fb4bbaa9467ec96fc9ac51213866f294d3dd0cc0ae16b7b7c51ada"} Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.573044 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561944-lknlf" Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.573191 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e66b6a5c1fb4bbaa9467ec96fc9ac51213866f294d3dd0cc0ae16b7b7c51ada" Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.661357 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561938-22zck"] Mar 17 03:04:05 crc kubenswrapper[4735]: I0317 03:04:05.674823 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561938-22zck"] Mar 17 03:04:07 crc kubenswrapper[4735]: I0317 03:04:07.083261 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79373054-b0ae-47ba-a0f4-eb65a20d565b" path="/var/lib/kubelet/pods/79373054-b0ae-47ba-a0f4-eb65a20d565b/volumes" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.703426 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2z55m"] Mar 17 03:04:50 crc kubenswrapper[4735]: E0317 03:04:50.704605 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="70039033-a219-4b40-a247-c3b668ab9404" containerName="oc" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.704627 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="70039033-a219-4b40-a247-c3b668ab9404" containerName="oc" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.704998 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="70039033-a219-4b40-a247-c3b668ab9404" containerName="oc" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.707334 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.730121 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z55m"] Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.838458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-utilities\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.838503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-catalog-content\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.838612 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w78kz\" (UniqueName: \"kubernetes.io/projected/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-kube-api-access-w78kz\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " 
pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.939901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-utilities\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.939942 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-catalog-content\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.940010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w78kz\" (UniqueName: \"kubernetes.io/projected/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-kube-api-access-w78kz\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.940950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-utilities\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.941232 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-catalog-content\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " 
pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:50 crc kubenswrapper[4735]: I0317 03:04:50.976616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w78kz\" (UniqueName: \"kubernetes.io/projected/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-kube-api-access-w78kz\") pod \"certified-operators-2z55m\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:51 crc kubenswrapper[4735]: I0317 03:04:51.035962 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:04:51 crc kubenswrapper[4735]: I0317 03:04:51.640105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z55m"] Mar 17 03:04:52 crc kubenswrapper[4735]: I0317 03:04:52.049149 4735 generic.go:334] "Generic (PLEG): container finished" podID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerID="81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6" exitCode=0 Mar 17 03:04:52 crc kubenswrapper[4735]: I0317 03:04:52.049226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerDied","Data":"81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6"} Mar 17 03:04:52 crc kubenswrapper[4735]: I0317 03:04:52.049508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerStarted","Data":"00271ad1dc5521bdb7b5f1c5da9bb96b6724edce75b04448a3900a2e9d2b0c80"} Mar 17 03:04:54 crc kubenswrapper[4735]: I0317 03:04:54.084120 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" 
event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerStarted","Data":"8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161"} Mar 17 03:04:55 crc kubenswrapper[4735]: I0317 03:04:55.101563 4735 generic.go:334] "Generic (PLEG): container finished" podID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerID="8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161" exitCode=0 Mar 17 03:04:55 crc kubenswrapper[4735]: I0317 03:04:55.103468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerDied","Data":"8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161"} Mar 17 03:04:56 crc kubenswrapper[4735]: I0317 03:04:56.135443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerStarted","Data":"b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23"} Mar 17 03:04:56 crc kubenswrapper[4735]: I0317 03:04:56.161620 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2z55m" podStartSLOduration=2.674686362 podStartE2EDuration="6.161597845s" podCreationTimestamp="2026-03-17 03:04:50 +0000 UTC" firstStartedPulling="2026-03-17 03:04:52.05281992 +0000 UTC m=+6917.685052898" lastFinishedPulling="2026-03-17 03:04:55.539731393 +0000 UTC m=+6921.171964381" observedRunningTime="2026-03-17 03:04:56.157517026 +0000 UTC m=+6921.789750044" watchObservedRunningTime="2026-03-17 03:04:56.161597845 +0000 UTC m=+6921.793830833" Mar 17 03:05:01 crc kubenswrapper[4735]: I0317 03:05:01.037113 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:05:01 crc kubenswrapper[4735]: I0317 03:05:01.037666 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:05:02 crc kubenswrapper[4735]: I0317 03:05:02.096979 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2z55m" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="registry-server" probeResult="failure" output=< Mar 17 03:05:02 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:05:02 crc kubenswrapper[4735]: > Mar 17 03:05:05 crc kubenswrapper[4735]: I0317 03:05:05.350014 4735 scope.go:117] "RemoveContainer" containerID="32d77d9358c08cafc9f37a38894aa4dbb1094e36718900c2dc08dcdb2427e903" Mar 17 03:05:11 crc kubenswrapper[4735]: I0317 03:05:11.087130 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:05:11 crc kubenswrapper[4735]: I0317 03:05:11.140608 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:05:11 crc kubenswrapper[4735]: I0317 03:05:11.330826 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z55m"] Mar 17 03:05:12 crc kubenswrapper[4735]: I0317 03:05:12.299833 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2z55m" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="registry-server" containerID="cri-o://b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23" gracePeriod=2 Mar 17 03:05:12 crc kubenswrapper[4735]: I0317 03:05:12.951660 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.103661 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w78kz\" (UniqueName: \"kubernetes.io/projected/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-kube-api-access-w78kz\") pod \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.103771 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-utilities\") pod \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.103881 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-catalog-content\") pod \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\" (UID: \"a5b4ca52-2f54-4842-a082-27cb4edaa3e6\") " Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.104593 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-utilities" (OuterVolumeSpecName: "utilities") pod "a5b4ca52-2f54-4842-a082-27cb4edaa3e6" (UID: "a5b4ca52-2f54-4842-a082-27cb4edaa3e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.119781 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-kube-api-access-w78kz" (OuterVolumeSpecName: "kube-api-access-w78kz") pod "a5b4ca52-2f54-4842-a082-27cb4edaa3e6" (UID: "a5b4ca52-2f54-4842-a082-27cb4edaa3e6"). InnerVolumeSpecName "kube-api-access-w78kz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.163237 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5b4ca52-2f54-4842-a082-27cb4edaa3e6" (UID: "a5b4ca52-2f54-4842-a082-27cb4edaa3e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.206404 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.206452 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.206472 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w78kz\" (UniqueName: \"kubernetes.io/projected/a5b4ca52-2f54-4842-a082-27cb4edaa3e6-kube-api-access-w78kz\") on node \"crc\" DevicePath \"\"" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.310542 4735 generic.go:334] "Generic (PLEG): container finished" podID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerID="b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23" exitCode=0 Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.310581 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z55m" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.310614 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerDied","Data":"b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23"} Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.310669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z55m" event={"ID":"a5b4ca52-2f54-4842-a082-27cb4edaa3e6","Type":"ContainerDied","Data":"00271ad1dc5521bdb7b5f1c5da9bb96b6724edce75b04448a3900a2e9d2b0c80"} Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.310694 4735 scope.go:117] "RemoveContainer" containerID="b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.338883 4735 scope.go:117] "RemoveContainer" containerID="8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.360049 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z55m"] Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.368767 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2z55m"] Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.384579 4735 scope.go:117] "RemoveContainer" containerID="81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.428353 4735 scope.go:117] "RemoveContainer" containerID="b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23" Mar 17 03:05:13 crc kubenswrapper[4735]: E0317 03:05:13.428998 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23\": container with ID starting with b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23 not found: ID does not exist" containerID="b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.429041 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23"} err="failed to get container status \"b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23\": rpc error: code = NotFound desc = could not find container \"b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23\": container with ID starting with b2670f9a99ec4cb4923ad2077d282ab94aceb51e00be57efc8d3e4d89583ef23 not found: ID does not exist" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.429064 4735 scope.go:117] "RemoveContainer" containerID="8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161" Mar 17 03:05:13 crc kubenswrapper[4735]: E0317 03:05:13.429484 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161\": container with ID starting with 8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161 not found: ID does not exist" containerID="8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.429526 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161"} err="failed to get container status \"8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161\": rpc error: code = NotFound desc = could not find container \"8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161\": container with ID 
starting with 8209d51328b51e51a22e58a1a434602efcc8b45fa6362f5a05aa272aa6b92161 not found: ID does not exist" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.429549 4735 scope.go:117] "RemoveContainer" containerID="81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6" Mar 17 03:05:13 crc kubenswrapper[4735]: E0317 03:05:13.429876 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6\": container with ID starting with 81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6 not found: ID does not exist" containerID="81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6" Mar 17 03:05:13 crc kubenswrapper[4735]: I0317 03:05:13.429973 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6"} err="failed to get container status \"81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6\": rpc error: code = NotFound desc = could not find container \"81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6\": container with ID starting with 81339102aed21ff2666eb675eaae3c97473403e6cb2ba6074126ae7635a5e5c6 not found: ID does not exist" Mar 17 03:05:15 crc kubenswrapper[4735]: I0317 03:05:15.092182 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" path="/var/lib/kubelet/pods/a5b4ca52-2f54-4842-a082-27cb4edaa3e6/volumes" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.168132 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561946-ndnfg"] Mar 17 03:06:00 crc kubenswrapper[4735]: E0317 03:06:00.169017 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="extract-utilities" Mar 17 03:06:00 crc 
kubenswrapper[4735]: I0317 03:06:00.169031 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="extract-utilities" Mar 17 03:06:00 crc kubenswrapper[4735]: E0317 03:06:00.169075 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="extract-content" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.169081 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="extract-content" Mar 17 03:06:00 crc kubenswrapper[4735]: E0317 03:06:00.169092 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="registry-server" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.169097 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="registry-server" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.169257 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b4ca52-2f54-4842-a082-27cb4edaa3e6" containerName="registry-server" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.169901 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.175584 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.175686 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.175596 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.188429 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561946-ndnfg"] Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.320366 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qbds\" (UniqueName: \"kubernetes.io/projected/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6-kube-api-access-6qbds\") pod \"auto-csr-approver-29561946-ndnfg\" (UID: \"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6\") " pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.423070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qbds\" (UniqueName: \"kubernetes.io/projected/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6-kube-api-access-6qbds\") pod \"auto-csr-approver-29561946-ndnfg\" (UID: \"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6\") " pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.446199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qbds\" (UniqueName: \"kubernetes.io/projected/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6-kube-api-access-6qbds\") pod \"auto-csr-approver-29561946-ndnfg\" (UID: \"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6\") " 
pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:00 crc kubenswrapper[4735]: I0317 03:06:00.495037 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:01 crc kubenswrapper[4735]: I0317 03:06:01.000509 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561946-ndnfg"] Mar 17 03:06:01 crc kubenswrapper[4735]: I0317 03:06:01.805069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" event={"ID":"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6","Type":"ContainerStarted","Data":"e259fcb60bbd082633aca06f5dbc457aea89a14d2df1b16701318044d70577cd"} Mar 17 03:06:02 crc kubenswrapper[4735]: I0317 03:06:02.820441 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" event={"ID":"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6","Type":"ContainerStarted","Data":"9d83dd8eadd3682de3feddd0337a273d9a550d87247d45ddf3e37cf91be76686"} Mar 17 03:06:02 crc kubenswrapper[4735]: I0317 03:06:02.850465 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" podStartSLOduration=1.93225777 podStartE2EDuration="2.850436211s" podCreationTimestamp="2026-03-17 03:06:00 +0000 UTC" firstStartedPulling="2026-03-17 03:06:01.000905462 +0000 UTC m=+6986.633138460" lastFinishedPulling="2026-03-17 03:06:01.919083923 +0000 UTC m=+6987.551316901" observedRunningTime="2026-03-17 03:06:02.835651955 +0000 UTC m=+6988.467884973" watchObservedRunningTime="2026-03-17 03:06:02.850436211 +0000 UTC m=+6988.482669219" Mar 17 03:06:03 crc kubenswrapper[4735]: I0317 03:06:03.838015 4735 generic.go:334] "Generic (PLEG): container finished" podID="bcb7fb6d-743a-422c-8d87-2ff9d93e84f6" containerID="9d83dd8eadd3682de3feddd0337a273d9a550d87247d45ddf3e37cf91be76686" exitCode=0 Mar 17 03:06:03 crc 
kubenswrapper[4735]: I0317 03:06:03.838114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" event={"ID":"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6","Type":"ContainerDied","Data":"9d83dd8eadd3682de3feddd0337a273d9a550d87247d45ddf3e37cf91be76686"} Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.336534 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.528284 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qbds\" (UniqueName: \"kubernetes.io/projected/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6-kube-api-access-6qbds\") pod \"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6\" (UID: \"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6\") " Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.541004 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6-kube-api-access-6qbds" (OuterVolumeSpecName: "kube-api-access-6qbds") pod "bcb7fb6d-743a-422c-8d87-2ff9d93e84f6" (UID: "bcb7fb6d-743a-422c-8d87-2ff9d93e84f6"). InnerVolumeSpecName "kube-api-access-6qbds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.632592 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qbds\" (UniqueName: \"kubernetes.io/projected/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6-kube-api-access-6qbds\") on node \"crc\" DevicePath \"\"" Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.861458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" event={"ID":"bcb7fb6d-743a-422c-8d87-2ff9d93e84f6","Type":"ContainerDied","Data":"e259fcb60bbd082633aca06f5dbc457aea89a14d2df1b16701318044d70577cd"} Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.861506 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561946-ndnfg" Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.861510 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e259fcb60bbd082633aca06f5dbc457aea89a14d2df1b16701318044d70577cd" Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.957342 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561940-b4kfz"] Mar 17 03:06:05 crc kubenswrapper[4735]: I0317 03:06:05.969586 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561940-b4kfz"] Mar 17 03:06:07 crc kubenswrapper[4735]: I0317 03:06:07.089259 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98065f64-473d-47dd-b1e4-02cdf53a3fc0" path="/var/lib/kubelet/pods/98065f64-473d-47dd-b1e4-02cdf53a3fc0/volumes" Mar 17 03:06:12 crc kubenswrapper[4735]: I0317 03:06:12.606352 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 03:06:12 crc kubenswrapper[4735]: I0317 03:06:12.606873 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:06:42 crc kubenswrapper[4735]: I0317 03:06:42.606034 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:06:42 crc kubenswrapper[4735]: I0317 03:06:42.606643 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:07:05 crc kubenswrapper[4735]: I0317 03:07:05.454538 4735 scope.go:117] "RemoveContainer" containerID="cf68af1a0cd5ecc74a86674359a1ee1f4d54729a7654cbc575bfc9ee5db98754" Mar 17 03:07:12 crc kubenswrapper[4735]: I0317 03:07:12.606004 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:07:12 crc kubenswrapper[4735]: I0317 03:07:12.606398 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:07:12 crc kubenswrapper[4735]: I0317 03:07:12.606442 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:07:12 crc kubenswrapper[4735]: I0317 03:07:12.607681 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0e4629fb1d1a1d92d5ca81b95949500fbbb3e9432ec413c34d1a632ff7dcf1f"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:07:12 crc kubenswrapper[4735]: I0317 03:07:12.607749 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://b0e4629fb1d1a1d92d5ca81b95949500fbbb3e9432ec413c34d1a632ff7dcf1f" gracePeriod=600 Mar 17 03:07:13 crc kubenswrapper[4735]: I0317 03:07:13.561145 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="b0e4629fb1d1a1d92d5ca81b95949500fbbb3e9432ec413c34d1a632ff7dcf1f" exitCode=0 Mar 17 03:07:13 crc kubenswrapper[4735]: I0317 03:07:13.561197 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"b0e4629fb1d1a1d92d5ca81b95949500fbbb3e9432ec413c34d1a632ff7dcf1f"} Mar 17 03:07:13 crc kubenswrapper[4735]: I0317 03:07:13.561491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40"} Mar 17 03:07:13 crc kubenswrapper[4735]: I0317 03:07:13.561515 4735 scope.go:117] "RemoveContainer" containerID="f4cbdb4262717a7905b363e3babe813cfc3fb49882e315e2188959ed8b4a7891" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.164037 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561948-9sxpm"] Mar 17 03:08:00 crc kubenswrapper[4735]: E0317 03:08:00.165692 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb7fb6d-743a-422c-8d87-2ff9d93e84f6" containerName="oc" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.165725 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb7fb6d-743a-422c-8d87-2ff9d93e84f6" containerName="oc" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.166246 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb7fb6d-743a-422c-8d87-2ff9d93e84f6" containerName="oc" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.167622 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.170498 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.171258 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.177514 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561948-9sxpm"] Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.180096 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.256099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st59m\" (UniqueName: \"kubernetes.io/projected/3637e311-d76f-4485-a03b-add96cb61333-kube-api-access-st59m\") pod \"auto-csr-approver-29561948-9sxpm\" (UID: \"3637e311-d76f-4485-a03b-add96cb61333\") " pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.358921 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st59m\" (UniqueName: \"kubernetes.io/projected/3637e311-d76f-4485-a03b-add96cb61333-kube-api-access-st59m\") pod \"auto-csr-approver-29561948-9sxpm\" (UID: \"3637e311-d76f-4485-a03b-add96cb61333\") " pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.396730 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st59m\" (UniqueName: \"kubernetes.io/projected/3637e311-d76f-4485-a03b-add96cb61333-kube-api-access-st59m\") pod \"auto-csr-approver-29561948-9sxpm\" (UID: \"3637e311-d76f-4485-a03b-add96cb61333\") " 
pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:00 crc kubenswrapper[4735]: I0317 03:08:00.489777 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:01 crc kubenswrapper[4735]: I0317 03:08:01.128818 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561948-9sxpm"] Mar 17 03:08:01 crc kubenswrapper[4735]: W0317 03:08:01.132121 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3637e311_d76f_4485_a03b_add96cb61333.slice/crio-95d12c86aa249bec67893b99a832aeae90a85020cc92eeb776953e7a6193f575 WatchSource:0}: Error finding container 95d12c86aa249bec67893b99a832aeae90a85020cc92eeb776953e7a6193f575: Status 404 returned error can't find the container with id 95d12c86aa249bec67893b99a832aeae90a85020cc92eeb776953e7a6193f575 Mar 17 03:08:02 crc kubenswrapper[4735]: I0317 03:08:02.096964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" event={"ID":"3637e311-d76f-4485-a03b-add96cb61333","Type":"ContainerStarted","Data":"95d12c86aa249bec67893b99a832aeae90a85020cc92eeb776953e7a6193f575"} Mar 17 03:08:04 crc kubenswrapper[4735]: I0317 03:08:04.120234 4735 generic.go:334] "Generic (PLEG): container finished" podID="3637e311-d76f-4485-a03b-add96cb61333" containerID="2a84b0e631cac7bfc5da742b08be32432e3735e53c23c0e186db6266b057575d" exitCode=0 Mar 17 03:08:04 crc kubenswrapper[4735]: I0317 03:08:04.120282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" event={"ID":"3637e311-d76f-4485-a03b-add96cb61333","Type":"ContainerDied","Data":"2a84b0e631cac7bfc5da742b08be32432e3735e53c23c0e186db6266b057575d"} Mar 17 03:08:05 crc kubenswrapper[4735]: I0317 03:08:05.516850 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:05 crc kubenswrapper[4735]: I0317 03:08:05.593452 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st59m\" (UniqueName: \"kubernetes.io/projected/3637e311-d76f-4485-a03b-add96cb61333-kube-api-access-st59m\") pod \"3637e311-d76f-4485-a03b-add96cb61333\" (UID: \"3637e311-d76f-4485-a03b-add96cb61333\") " Mar 17 03:08:05 crc kubenswrapper[4735]: I0317 03:08:05.600737 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3637e311-d76f-4485-a03b-add96cb61333-kube-api-access-st59m" (OuterVolumeSpecName: "kube-api-access-st59m") pod "3637e311-d76f-4485-a03b-add96cb61333" (UID: "3637e311-d76f-4485-a03b-add96cb61333"). InnerVolumeSpecName "kube-api-access-st59m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:08:05 crc kubenswrapper[4735]: I0317 03:08:05.696436 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st59m\" (UniqueName: \"kubernetes.io/projected/3637e311-d76f-4485-a03b-add96cb61333-kube-api-access-st59m\") on node \"crc\" DevicePath \"\"" Mar 17 03:08:06 crc kubenswrapper[4735]: I0317 03:08:06.140264 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" event={"ID":"3637e311-d76f-4485-a03b-add96cb61333","Type":"ContainerDied","Data":"95d12c86aa249bec67893b99a832aeae90a85020cc92eeb776953e7a6193f575"} Mar 17 03:08:06 crc kubenswrapper[4735]: I0317 03:08:06.140668 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d12c86aa249bec67893b99a832aeae90a85020cc92eeb776953e7a6193f575" Mar 17 03:08:06 crc kubenswrapper[4735]: I0317 03:08:06.140466 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561948-9sxpm" Mar 17 03:08:06 crc kubenswrapper[4735]: I0317 03:08:06.591310 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561942-5wgmt"] Mar 17 03:08:06 crc kubenswrapper[4735]: I0317 03:08:06.599979 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561942-5wgmt"] Mar 17 03:08:07 crc kubenswrapper[4735]: I0317 03:08:07.084632 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5617a772-5a13-4cea-a8d0-670299f82028" path="/var/lib/kubelet/pods/5617a772-5a13-4cea-a8d0-670299f82028/volumes" Mar 17 03:09:05 crc kubenswrapper[4735]: I0317 03:09:05.563946 4735 scope.go:117] "RemoveContainer" containerID="e8def1ca637278f228ab727aeaa4534fb6ffee2fc9e132729e461b28eee25b70" Mar 17 03:09:12 crc kubenswrapper[4735]: I0317 03:09:12.606544 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:09:12 crc kubenswrapper[4735]: I0317 03:09:12.607421 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.701638 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pgg5q"] Mar 17 03:09:13 crc kubenswrapper[4735]: E0317 03:09:13.702321 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3637e311-d76f-4485-a03b-add96cb61333" containerName="oc" Mar 17 03:09:13 crc 
kubenswrapper[4735]: I0317 03:09:13.702356 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3637e311-d76f-4485-a03b-add96cb61333" containerName="oc" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.702653 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3637e311-d76f-4485-a03b-add96cb61333" containerName="oc" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.704701 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.719566 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgg5q"] Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.780716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spm9\" (UniqueName: \"kubernetes.io/projected/6d5d593d-2b58-4752-add5-d32e669a848b-kube-api-access-5spm9\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.780775 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-catalog-content\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.780843 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-utilities\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 
03:09:13.882661 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spm9\" (UniqueName: \"kubernetes.io/projected/6d5d593d-2b58-4752-add5-d32e669a848b-kube-api-access-5spm9\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.882711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-catalog-content\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.882761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-utilities\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.883210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-utilities\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.883340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-catalog-content\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:13 crc kubenswrapper[4735]: I0317 03:09:13.906514 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5spm9\" (UniqueName: \"kubernetes.io/projected/6d5d593d-2b58-4752-add5-d32e669a848b-kube-api-access-5spm9\") pod \"redhat-operators-pgg5q\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:14 crc kubenswrapper[4735]: I0317 03:09:14.031348 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:14 crc kubenswrapper[4735]: I0317 03:09:14.765545 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgg5q"] Mar 17 03:09:14 crc kubenswrapper[4735]: I0317 03:09:14.828778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerStarted","Data":"39cd50737bb293499302078c9c1b1eaa65b499d6b9f0b1ad545b025388943170"} Mar 17 03:09:15 crc kubenswrapper[4735]: I0317 03:09:15.839422 4735 generic.go:334] "Generic (PLEG): container finished" podID="6d5d593d-2b58-4752-add5-d32e669a848b" containerID="b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e" exitCode=0 Mar 17 03:09:15 crc kubenswrapper[4735]: I0317 03:09:15.839486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerDied","Data":"b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e"} Mar 17 03:09:15 crc kubenswrapper[4735]: I0317 03:09:15.842496 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:09:16 crc kubenswrapper[4735]: I0317 03:09:16.853022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" 
event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerStarted","Data":"8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44"} Mar 17 03:09:21 crc kubenswrapper[4735]: I0317 03:09:21.914345 4735 generic.go:334] "Generic (PLEG): container finished" podID="6d5d593d-2b58-4752-add5-d32e669a848b" containerID="8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44" exitCode=0 Mar 17 03:09:21 crc kubenswrapper[4735]: I0317 03:09:21.915739 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerDied","Data":"8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44"} Mar 17 03:09:22 crc kubenswrapper[4735]: I0317 03:09:22.928390 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerStarted","Data":"56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e"} Mar 17 03:09:22 crc kubenswrapper[4735]: I0317 03:09:22.958482 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pgg5q" podStartSLOduration=3.470965574 podStartE2EDuration="9.958458819s" podCreationTimestamp="2026-03-17 03:09:13 +0000 UTC" firstStartedPulling="2026-03-17 03:09:15.842099124 +0000 UTC m=+7181.474332102" lastFinishedPulling="2026-03-17 03:09:22.329592369 +0000 UTC m=+7187.961825347" observedRunningTime="2026-03-17 03:09:22.947037334 +0000 UTC m=+7188.579270322" watchObservedRunningTime="2026-03-17 03:09:22.958458819 +0000 UTC m=+7188.590691807" Mar 17 03:09:24 crc kubenswrapper[4735]: I0317 03:09:24.031503 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:24 crc kubenswrapper[4735]: I0317 03:09:24.032665 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:09:25 crc kubenswrapper[4735]: I0317 03:09:25.093071 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgg5q" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" probeResult="failure" output=< Mar 17 03:09:25 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:09:25 crc kubenswrapper[4735]: > Mar 17 03:09:35 crc kubenswrapper[4735]: I0317 03:09:35.108776 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgg5q" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" probeResult="failure" output=< Mar 17 03:09:35 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:09:35 crc kubenswrapper[4735]: > Mar 17 03:09:42 crc kubenswrapper[4735]: I0317 03:09:42.606546 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:09:42 crc kubenswrapper[4735]: I0317 03:09:42.607223 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:09:45 crc kubenswrapper[4735]: I0317 03:09:45.081330 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgg5q" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" probeResult="failure" output=< Mar 17 03:09:45 crc kubenswrapper[4735]: timeout: failed to connect 
service ":50051" within 1s Mar 17 03:09:45 crc kubenswrapper[4735]: > Mar 17 03:09:55 crc kubenswrapper[4735]: I0317 03:09:55.106106 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgg5q" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" probeResult="failure" output=< Mar 17 03:09:55 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:09:55 crc kubenswrapper[4735]: > Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.190150 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561950-9cn7g"] Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.195780 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.204800 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.205481 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.211896 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.212551 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561950-9cn7g"] Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.308031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtzcg\" (UniqueName: \"kubernetes.io/projected/d2d67eba-b70c-45df-8ca8-1c9950934639-kube-api-access-xtzcg\") pod \"auto-csr-approver-29561950-9cn7g\" (UID: \"d2d67eba-b70c-45df-8ca8-1c9950934639\") " pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 
03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.410125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtzcg\" (UniqueName: \"kubernetes.io/projected/d2d67eba-b70c-45df-8ca8-1c9950934639-kube-api-access-xtzcg\") pod \"auto-csr-approver-29561950-9cn7g\" (UID: \"d2d67eba-b70c-45df-8ca8-1c9950934639\") " pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.448923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtzcg\" (UniqueName: \"kubernetes.io/projected/d2d67eba-b70c-45df-8ca8-1c9950934639-kube-api-access-xtzcg\") pod \"auto-csr-approver-29561950-9cn7g\" (UID: \"d2d67eba-b70c-45df-8ca8-1c9950934639\") " pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 03:10:00 crc kubenswrapper[4735]: I0317 03:10:00.520969 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 03:10:01 crc kubenswrapper[4735]: I0317 03:10:01.385831 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561950-9cn7g"] Mar 17 03:10:02 crc kubenswrapper[4735]: I0317 03:10:02.296795 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" event={"ID":"d2d67eba-b70c-45df-8ca8-1c9950934639","Type":"ContainerStarted","Data":"efea1b4b26698bcd8c6c5f9f391a95c4ca9405dbebf2d69e9f4ba315801670eb"} Mar 17 03:10:04 crc kubenswrapper[4735]: I0317 03:10:04.103573 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:10:04 crc kubenswrapper[4735]: I0317 03:10:04.185922 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:10:04 crc kubenswrapper[4735]: I0317 03:10:04.312676 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" event={"ID":"d2d67eba-b70c-45df-8ca8-1c9950934639","Type":"ContainerStarted","Data":"7efce6fc910680bf69ff3239f4cc23fbdc2c61a9c7cee3acba28e9732d80805b"} Mar 17 03:10:04 crc kubenswrapper[4735]: I0317 03:10:04.345968 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" podStartSLOduration=3.159191729 podStartE2EDuration="4.345947542s" podCreationTimestamp="2026-03-17 03:10:00 +0000 UTC" firstStartedPulling="2026-03-17 03:10:01.405942638 +0000 UTC m=+7227.038175616" lastFinishedPulling="2026-03-17 03:10:02.592698461 +0000 UTC m=+7228.224931429" observedRunningTime="2026-03-17 03:10:04.333699048 +0000 UTC m=+7229.965932026" watchObservedRunningTime="2026-03-17 03:10:04.345947542 +0000 UTC m=+7229.978180540" Mar 17 03:10:04 crc kubenswrapper[4735]: I0317 03:10:04.359596 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgg5q"] Mar 17 03:10:05 crc kubenswrapper[4735]: I0317 03:10:05.328650 4735 generic.go:334] "Generic (PLEG): container finished" podID="d2d67eba-b70c-45df-8ca8-1c9950934639" containerID="7efce6fc910680bf69ff3239f4cc23fbdc2c61a9c7cee3acba28e9732d80805b" exitCode=0 Mar 17 03:10:05 crc kubenswrapper[4735]: I0317 03:10:05.328784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" event={"ID":"d2d67eba-b70c-45df-8ca8-1c9950934639","Type":"ContainerDied","Data":"7efce6fc910680bf69ff3239f4cc23fbdc2c61a9c7cee3acba28e9732d80805b"} Mar 17 03:10:05 crc kubenswrapper[4735]: I0317 03:10:05.329362 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pgg5q" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" containerID="cri-o://56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e" gracePeriod=2 Mar 17 03:10:06 crc 
kubenswrapper[4735]: I0317 03:10:06.055818 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.215422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5spm9\" (UniqueName: \"kubernetes.io/projected/6d5d593d-2b58-4752-add5-d32e669a848b-kube-api-access-5spm9\") pod \"6d5d593d-2b58-4752-add5-d32e669a848b\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.215558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-catalog-content\") pod \"6d5d593d-2b58-4752-add5-d32e669a848b\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.215664 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-utilities\") pod \"6d5d593d-2b58-4752-add5-d32e669a848b\" (UID: \"6d5d593d-2b58-4752-add5-d32e669a848b\") " Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.216331 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-utilities" (OuterVolumeSpecName: "utilities") pod "6d5d593d-2b58-4752-add5-d32e669a848b" (UID: "6d5d593d-2b58-4752-add5-d32e669a848b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.228079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5d593d-2b58-4752-add5-d32e669a848b-kube-api-access-5spm9" (OuterVolumeSpecName: "kube-api-access-5spm9") pod "6d5d593d-2b58-4752-add5-d32e669a848b" (UID: "6d5d593d-2b58-4752-add5-d32e669a848b"). InnerVolumeSpecName "kube-api-access-5spm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.324442 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.324484 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5spm9\" (UniqueName: \"kubernetes.io/projected/6d5d593d-2b58-4752-add5-d32e669a848b-kube-api-access-5spm9\") on node \"crc\" DevicePath \"\"" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.340841 4735 generic.go:334] "Generic (PLEG): container finished" podID="6d5d593d-2b58-4752-add5-d32e669a848b" containerID="56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e" exitCode=0 Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.340907 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerDied","Data":"56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e"} Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.340959 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgg5q" event={"ID":"6d5d593d-2b58-4752-add5-d32e669a848b","Type":"ContainerDied","Data":"39cd50737bb293499302078c9c1b1eaa65b499d6b9f0b1ad545b025388943170"} Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 
03:10:06.340964 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgg5q" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.340988 4735 scope.go:117] "RemoveContainer" containerID="56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.375787 4735 scope.go:117] "RemoveContainer" containerID="8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.382830 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d5d593d-2b58-4752-add5-d32e669a848b" (UID: "6d5d593d-2b58-4752-add5-d32e669a848b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.399489 4735 scope.go:117] "RemoveContainer" containerID="b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.425981 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5d593d-2b58-4752-add5-d32e669a848b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.448935 4735 scope.go:117] "RemoveContainer" containerID="56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e" Mar 17 03:10:06 crc kubenswrapper[4735]: E0317 03:10:06.449373 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e\": container with ID starting with 56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e not found: ID does not exist" 
containerID="56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.449405 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e"} err="failed to get container status \"56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e\": rpc error: code = NotFound desc = could not find container \"56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e\": container with ID starting with 56f65c609ec085d3a6a8462871f0ab07ad2a6c67d402da80af5c1fa6df44fb0e not found: ID does not exist" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.449427 4735 scope.go:117] "RemoveContainer" containerID="8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44" Mar 17 03:10:06 crc kubenswrapper[4735]: E0317 03:10:06.450709 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44\": container with ID starting with 8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44 not found: ID does not exist" containerID="8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.450735 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44"} err="failed to get container status \"8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44\": rpc error: code = NotFound desc = could not find container \"8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44\": container with ID starting with 8eceb573d5f54c76733a38c8c1cdd2ece1028cdd0619ce75ca31a8d9e7485c44 not found: ID does not exist" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.450751 4735 scope.go:117] 
"RemoveContainer" containerID="b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e" Mar 17 03:10:06 crc kubenswrapper[4735]: E0317 03:10:06.451144 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e\": container with ID starting with b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e not found: ID does not exist" containerID="b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.451166 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e"} err="failed to get container status \"b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e\": rpc error: code = NotFound desc = could not find container \"b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e\": container with ID starting with b5a43c50bcd2bc0c9875f0bb924dad35ca092608cd5cbcf8f74fe0cc217d279e not found: ID does not exist" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.672174 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgg5q"] Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.684458 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pgg5q"] Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.812984 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.934120 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtzcg\" (UniqueName: \"kubernetes.io/projected/d2d67eba-b70c-45df-8ca8-1c9950934639-kube-api-access-xtzcg\") pod \"d2d67eba-b70c-45df-8ca8-1c9950934639\" (UID: \"d2d67eba-b70c-45df-8ca8-1c9950934639\") " Mar 17 03:10:06 crc kubenswrapper[4735]: I0317 03:10:06.939613 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d67eba-b70c-45df-8ca8-1c9950934639-kube-api-access-xtzcg" (OuterVolumeSpecName: "kube-api-access-xtzcg") pod "d2d67eba-b70c-45df-8ca8-1c9950934639" (UID: "d2d67eba-b70c-45df-8ca8-1c9950934639"). InnerVolumeSpecName "kube-api-access-xtzcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:10:07 crc kubenswrapper[4735]: I0317 03:10:07.037414 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtzcg\" (UniqueName: \"kubernetes.io/projected/d2d67eba-b70c-45df-8ca8-1c9950934639-kube-api-access-xtzcg\") on node \"crc\" DevicePath \"\"" Mar 17 03:10:07 crc kubenswrapper[4735]: I0317 03:10:07.089957 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" path="/var/lib/kubelet/pods/6d5d593d-2b58-4752-add5-d32e669a848b/volumes" Mar 17 03:10:07 crc kubenswrapper[4735]: I0317 03:10:07.356783 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" event={"ID":"d2d67eba-b70c-45df-8ca8-1c9950934639","Type":"ContainerDied","Data":"efea1b4b26698bcd8c6c5f9f391a95c4ca9405dbebf2d69e9f4ba315801670eb"} Mar 17 03:10:07 crc kubenswrapper[4735]: I0317 03:10:07.356823 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efea1b4b26698bcd8c6c5f9f391a95c4ca9405dbebf2d69e9f4ba315801670eb" Mar 17 03:10:07 
crc kubenswrapper[4735]: I0317 03:10:07.356867 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561950-9cn7g" Mar 17 03:10:07 crc kubenswrapper[4735]: I0317 03:10:07.420959 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561944-lknlf"] Mar 17 03:10:07 crc kubenswrapper[4735]: I0317 03:10:07.430875 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561944-lknlf"] Mar 17 03:10:09 crc kubenswrapper[4735]: I0317 03:10:09.087026 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70039033-a219-4b40-a247-c3b668ab9404" path="/var/lib/kubelet/pods/70039033-a219-4b40-a247-c3b668ab9404/volumes" Mar 17 03:10:12 crc kubenswrapper[4735]: I0317 03:10:12.607071 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:10:12 crc kubenswrapper[4735]: I0317 03:10:12.608031 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:10:12 crc kubenswrapper[4735]: I0317 03:10:12.608110 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:10:12 crc kubenswrapper[4735]: I0317 03:10:12.609355 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:10:12 crc kubenswrapper[4735]: I0317 03:10:12.609493 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" gracePeriod=600 Mar 17 03:10:12 crc kubenswrapper[4735]: E0317 03:10:12.736182 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:10:13 crc kubenswrapper[4735]: I0317 03:10:13.415840 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" exitCode=0 Mar 17 03:10:13 crc kubenswrapper[4735]: I0317 03:10:13.416199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40"} Mar 17 03:10:13 crc kubenswrapper[4735]: I0317 03:10:13.416249 4735 scope.go:117] "RemoveContainer" containerID="b0e4629fb1d1a1d92d5ca81b95949500fbbb3e9432ec413c34d1a632ff7dcf1f" Mar 17 03:10:13 crc kubenswrapper[4735]: I0317 03:10:13.417422 4735 
scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:10:13 crc kubenswrapper[4735]: E0317 03:10:13.417975 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:10:21 crc kubenswrapper[4735]: E0317 03:10:21.966588 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:41134->38.102.83.65:40841: write tcp 38.102.83.65:41134->38.102.83.65:40841: write: broken pipe Mar 17 03:10:28 crc kubenswrapper[4735]: I0317 03:10:28.073521 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:10:28 crc kubenswrapper[4735]: E0317 03:10:28.074395 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:10:41 crc kubenswrapper[4735]: I0317 03:10:41.074954 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:10:41 crc kubenswrapper[4735]: E0317 03:10:41.075759 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:10:52 crc kubenswrapper[4735]: I0317 03:10:52.074747 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:10:52 crc kubenswrapper[4735]: E0317 03:10:52.076092 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:11:05 crc kubenswrapper[4735]: I0317 03:11:05.081305 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:11:05 crc kubenswrapper[4735]: E0317 03:11:05.082244 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:11:05 crc kubenswrapper[4735]: I0317 03:11:05.657132 4735 scope.go:117] "RemoveContainer" containerID="b570b845c2499be59d8b638871a40ef1a9127d258927789f4a5b5a3bedcf1400" Mar 17 03:11:16 crc kubenswrapper[4735]: I0317 03:11:16.073538 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:11:16 crc kubenswrapper[4735]: E0317 03:11:16.076078 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:11:27 crc kubenswrapper[4735]: E0317 03:11:27.172284 4735 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.65:54796->38.102.83.65:40841: write tcp 192.168.126.11:10250->192.168.126.11:46112: write: connection reset by peer Mar 17 03:11:30 crc kubenswrapper[4735]: I0317 03:11:30.073159 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:11:30 crc kubenswrapper[4735]: E0317 03:11:30.073986 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:11:43 crc kubenswrapper[4735]: I0317 03:11:43.073951 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:11:43 crc kubenswrapper[4735]: E0317 03:11:43.074653 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:11:54 crc kubenswrapper[4735]: I0317 03:11:54.073119 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:11:54 crc kubenswrapper[4735]: E0317 03:11:54.074019 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.148278 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561952-dcxvk"] Mar 17 03:12:00 crc kubenswrapper[4735]: E0317 03:12:00.149200 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.149214 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" Mar 17 03:12:00 crc kubenswrapper[4735]: E0317 03:12:00.149238 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d67eba-b70c-45df-8ca8-1c9950934639" containerName="oc" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.149244 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d67eba-b70c-45df-8ca8-1c9950934639" containerName="oc" Mar 17 03:12:00 crc kubenswrapper[4735]: E0317 03:12:00.149250 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="extract-utilities" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.149257 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="extract-utilities" Mar 17 03:12:00 crc kubenswrapper[4735]: E0317 03:12:00.149274 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="extract-content" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.149281 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="extract-content" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.149465 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d67eba-b70c-45df-8ca8-1c9950934639" containerName="oc" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.149479 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5d593d-2b58-4752-add5-d32e669a848b" containerName="registry-server" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.150101 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.152622 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.153914 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.154543 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.161426 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561952-dcxvk"] Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.249120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs5cx\" (UniqueName: 
\"kubernetes.io/projected/2a04dd83-8815-41ec-b2d2-eece5171d802-kube-api-access-cs5cx\") pod \"auto-csr-approver-29561952-dcxvk\" (UID: \"2a04dd83-8815-41ec-b2d2-eece5171d802\") " pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.350679 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs5cx\" (UniqueName: \"kubernetes.io/projected/2a04dd83-8815-41ec-b2d2-eece5171d802-kube-api-access-cs5cx\") pod \"auto-csr-approver-29561952-dcxvk\" (UID: \"2a04dd83-8815-41ec-b2d2-eece5171d802\") " pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.376189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs5cx\" (UniqueName: \"kubernetes.io/projected/2a04dd83-8815-41ec-b2d2-eece5171d802-kube-api-access-cs5cx\") pod \"auto-csr-approver-29561952-dcxvk\" (UID: \"2a04dd83-8815-41ec-b2d2-eece5171d802\") " pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:00 crc kubenswrapper[4735]: I0317 03:12:00.470714 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:01 crc kubenswrapper[4735]: I0317 03:12:01.060710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561952-dcxvk"] Mar 17 03:12:01 crc kubenswrapper[4735]: I0317 03:12:01.571254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" event={"ID":"2a04dd83-8815-41ec-b2d2-eece5171d802","Type":"ContainerStarted","Data":"e06c270d7afe285891d6de0044513641ed1be0246cb57a71d92f9226412c58ea"} Mar 17 03:12:02 crc kubenswrapper[4735]: I0317 03:12:02.589800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" event={"ID":"2a04dd83-8815-41ec-b2d2-eece5171d802","Type":"ContainerStarted","Data":"443841550cd14b8b5c2d046a2c8a05af09bbbf36f454a6b30189a4cd6f54f742"} Mar 17 03:12:03 crc kubenswrapper[4735]: I0317 03:12:03.604642 4735 generic.go:334] "Generic (PLEG): container finished" podID="2a04dd83-8815-41ec-b2d2-eece5171d802" containerID="443841550cd14b8b5c2d046a2c8a05af09bbbf36f454a6b30189a4cd6f54f742" exitCode=0 Mar 17 03:12:03 crc kubenswrapper[4735]: I0317 03:12:03.604731 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" event={"ID":"2a04dd83-8815-41ec-b2d2-eece5171d802","Type":"ContainerDied","Data":"443841550cd14b8b5c2d046a2c8a05af09bbbf36f454a6b30189a4cd6f54f742"} Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.080816 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:12:05 crc kubenswrapper[4735]: E0317 03:12:05.081269 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.095393 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.151366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs5cx\" (UniqueName: \"kubernetes.io/projected/2a04dd83-8815-41ec-b2d2-eece5171d802-kube-api-access-cs5cx\") pod \"2a04dd83-8815-41ec-b2d2-eece5171d802\" (UID: \"2a04dd83-8815-41ec-b2d2-eece5171d802\") " Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.159496 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a04dd83-8815-41ec-b2d2-eece5171d802-kube-api-access-cs5cx" (OuterVolumeSpecName: "kube-api-access-cs5cx") pod "2a04dd83-8815-41ec-b2d2-eece5171d802" (UID: "2a04dd83-8815-41ec-b2d2-eece5171d802"). InnerVolumeSpecName "kube-api-access-cs5cx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.254258 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs5cx\" (UniqueName: \"kubernetes.io/projected/2a04dd83-8815-41ec-b2d2-eece5171d802-kube-api-access-cs5cx\") on node \"crc\" DevicePath \"\"" Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.623578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" event={"ID":"2a04dd83-8815-41ec-b2d2-eece5171d802","Type":"ContainerDied","Data":"e06c270d7afe285891d6de0044513641ed1be0246cb57a71d92f9226412c58ea"} Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.623932 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e06c270d7afe285891d6de0044513641ed1be0246cb57a71d92f9226412c58ea" Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.623991 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561952-dcxvk" Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.747991 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561946-ndnfg"] Mar 17 03:12:05 crc kubenswrapper[4735]: I0317 03:12:05.757650 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561946-ndnfg"] Mar 17 03:12:07 crc kubenswrapper[4735]: I0317 03:12:07.083466 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb7fb6d-743a-422c-8d87-2ff9d93e84f6" path="/var/lib/kubelet/pods/bcb7fb6d-743a-422c-8d87-2ff9d93e84f6/volumes" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.350086 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vwnmr"] Mar 17 03:12:19 crc kubenswrapper[4735]: E0317 03:12:19.352269 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a04dd83-8815-41ec-b2d2-eece5171d802" containerName="oc" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.352376 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a04dd83-8815-41ec-b2d2-eece5171d802" containerName="oc" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.352705 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a04dd83-8815-41ec-b2d2-eece5171d802" containerName="oc" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.354513 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.364791 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwnmr"] Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.438640 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-catalog-content\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.438718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-utilities\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.438841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr4fc\" (UniqueName: \"kubernetes.io/projected/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-kube-api-access-wr4fc\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " 
pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.540912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-catalog-content\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.541031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-utilities\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.541182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr4fc\" (UniqueName: \"kubernetes.io/projected/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-kube-api-access-wr4fc\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.541515 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-catalog-content\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.541548 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-utilities\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " 
pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.563356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr4fc\" (UniqueName: \"kubernetes.io/projected/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-kube-api-access-wr4fc\") pod \"community-operators-vwnmr\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:19 crc kubenswrapper[4735]: I0317 03:12:19.678312 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:20 crc kubenswrapper[4735]: I0317 03:12:20.077789 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:12:20 crc kubenswrapper[4735]: E0317 03:12:20.078633 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:12:20 crc kubenswrapper[4735]: I0317 03:12:20.336373 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwnmr"] Mar 17 03:12:21 crc kubenswrapper[4735]: I0317 03:12:21.120502 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerID="45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c" exitCode=0 Mar 17 03:12:21 crc kubenswrapper[4735]: I0317 03:12:21.120779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" 
event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerDied","Data":"45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c"} Mar 17 03:12:21 crc kubenswrapper[4735]: I0317 03:12:21.120847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerStarted","Data":"d54c527be7fa61827b5c74dd8f9e9fe10a47e6653c158990b7bcd6e43860770a"} Mar 17 03:12:22 crc kubenswrapper[4735]: I0317 03:12:22.131432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerStarted","Data":"7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb"} Mar 17 03:12:24 crc kubenswrapper[4735]: I0317 03:12:24.153320 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerID="7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb" exitCode=0 Mar 17 03:12:24 crc kubenswrapper[4735]: I0317 03:12:24.153408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerDied","Data":"7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb"} Mar 17 03:12:25 crc kubenswrapper[4735]: I0317 03:12:25.207126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerStarted","Data":"b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725"} Mar 17 03:12:25 crc kubenswrapper[4735]: I0317 03:12:25.242081 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vwnmr" podStartSLOduration=2.808532165 podStartE2EDuration="6.242062874s" podCreationTimestamp="2026-03-17 03:12:19 
+0000 UTC" firstStartedPulling="2026-03-17 03:12:21.123057144 +0000 UTC m=+7366.755290122" lastFinishedPulling="2026-03-17 03:12:24.556587803 +0000 UTC m=+7370.188820831" observedRunningTime="2026-03-17 03:12:25.233756074 +0000 UTC m=+7370.865989052" watchObservedRunningTime="2026-03-17 03:12:25.242062874 +0000 UTC m=+7370.874295852" Mar 17 03:12:29 crc kubenswrapper[4735]: I0317 03:12:29.678950 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:29 crc kubenswrapper[4735]: I0317 03:12:29.679433 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:29 crc kubenswrapper[4735]: I0317 03:12:29.754640 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:30 crc kubenswrapper[4735]: I0317 03:12:30.313664 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:30 crc kubenswrapper[4735]: I0317 03:12:30.376966 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vwnmr"] Mar 17 03:12:31 crc kubenswrapper[4735]: I0317 03:12:31.075040 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:12:31 crc kubenswrapper[4735]: E0317 03:12:31.075465 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:12:32 crc kubenswrapper[4735]: I0317 
03:12:32.276326 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vwnmr" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="registry-server" containerID="cri-o://b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725" gracePeriod=2 Mar 17 03:12:32 crc kubenswrapper[4735]: I0317 03:12:32.983536 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.046752 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-utilities\") pod \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.046988 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr4fc\" (UniqueName: \"kubernetes.io/projected/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-kube-api-access-wr4fc\") pod \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.047077 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-catalog-content\") pod \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\" (UID: \"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1\") " Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.047635 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-utilities" (OuterVolumeSpecName: "utilities") pod "2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" (UID: "2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.085154 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-kube-api-access-wr4fc" (OuterVolumeSpecName: "kube-api-access-wr4fc") pod "2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" (UID: "2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1"). InnerVolumeSpecName "kube-api-access-wr4fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.118174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" (UID: "2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.148326 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr4fc\" (UniqueName: \"kubernetes.io/projected/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-kube-api-access-wr4fc\") on node \"crc\" DevicePath \"\"" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.148356 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.148366 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.288096 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" 
containerID="b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725" exitCode=0 Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.288150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerDied","Data":"b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725"} Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.288183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwnmr" event={"ID":"2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1","Type":"ContainerDied","Data":"d54c527be7fa61827b5c74dd8f9e9fe10a47e6653c158990b7bcd6e43860770a"} Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.288204 4735 scope.go:117] "RemoveContainer" containerID="b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.288385 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vwnmr" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.318628 4735 scope.go:117] "RemoveContainer" containerID="7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.333713 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vwnmr"] Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.341602 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vwnmr"] Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.350309 4735 scope.go:117] "RemoveContainer" containerID="45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.394742 4735 scope.go:117] "RemoveContainer" containerID="b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725" Mar 17 03:12:33 crc kubenswrapper[4735]: E0317 03:12:33.395238 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725\": container with ID starting with b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725 not found: ID does not exist" containerID="b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.395310 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725"} err="failed to get container status \"b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725\": rpc error: code = NotFound desc = could not find container \"b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725\": container with ID starting with b0460751411a956524d64f5dbd8d7b4fbbe05855eabb25c1f808f5e5fa8e6725 not 
found: ID does not exist" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.395342 4735 scope.go:117] "RemoveContainer" containerID="7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb" Mar 17 03:12:33 crc kubenswrapper[4735]: E0317 03:12:33.395960 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb\": container with ID starting with 7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb not found: ID does not exist" containerID="7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.396008 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb"} err="failed to get container status \"7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb\": rpc error: code = NotFound desc = could not find container \"7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb\": container with ID starting with 7895275bd9aabbf5d8f0bb9d7757fdfb8daf20252ab40572cdc1d1a05164b5bb not found: ID does not exist" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.396033 4735 scope.go:117] "RemoveContainer" containerID="45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c" Mar 17 03:12:33 crc kubenswrapper[4735]: E0317 03:12:33.396257 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c\": container with ID starting with 45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c not found: ID does not exist" containerID="45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c" Mar 17 03:12:33 crc kubenswrapper[4735]: I0317 03:12:33.396275 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c"} err="failed to get container status \"45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c\": rpc error: code = NotFound desc = could not find container \"45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c\": container with ID starting with 45ad23607202b5cc07230e66d2c38aec5c93eb382298f26eaed5f352c9e6464c not found: ID does not exist" Mar 17 03:12:35 crc kubenswrapper[4735]: I0317 03:12:35.090338 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" path="/var/lib/kubelet/pods/2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1/volumes" Mar 17 03:12:42 crc kubenswrapper[4735]: I0317 03:12:42.073462 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:12:42 crc kubenswrapper[4735]: E0317 03:12:42.074395 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:12:57 crc kubenswrapper[4735]: I0317 03:12:57.073166 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:12:57 crc kubenswrapper[4735]: E0317 03:12:57.073995 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:13:05 crc kubenswrapper[4735]: I0317 03:13:05.816130 4735 scope.go:117] "RemoveContainer" containerID="9d83dd8eadd3682de3feddd0337a273d9a550d87247d45ddf3e37cf91be76686" Mar 17 03:13:09 crc kubenswrapper[4735]: I0317 03:13:09.073024 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:13:09 crc kubenswrapper[4735]: E0317 03:13:09.073677 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:13:23 crc kubenswrapper[4735]: I0317 03:13:23.073921 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:13:23 crc kubenswrapper[4735]: E0317 03:13:23.075259 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:13:38 crc kubenswrapper[4735]: I0317 03:13:38.074199 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:13:38 crc kubenswrapper[4735]: E0317 03:13:38.075118 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:13:51 crc kubenswrapper[4735]: I0317 03:13:51.077949 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:13:51 crc kubenswrapper[4735]: E0317 03:13:51.080318 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.157573 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561954-w8ddm"] Mar 17 03:14:00 crc kubenswrapper[4735]: E0317 03:14:00.158555 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="extract-content" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.158570 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="extract-content" Mar 17 03:14:00 crc kubenswrapper[4735]: E0317 03:14:00.158605 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="registry-server" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.158613 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="registry-server" Mar 17 03:14:00 crc kubenswrapper[4735]: E0317 03:14:00.158627 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="extract-utilities" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.158636 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="extract-utilities" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.158883 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea42a2d-1ccb-44b8-8fae-12ee1b867cb1" containerName="registry-server" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.159795 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.167176 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.167213 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.167246 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.172660 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561954-w8ddm"] Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.322338 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db4z9\" (UniqueName: \"kubernetes.io/projected/0f58bf8d-23a6-4437-b77d-c5859fd49afc-kube-api-access-db4z9\") pod \"auto-csr-approver-29561954-w8ddm\" (UID: \"0f58bf8d-23a6-4437-b77d-c5859fd49afc\") " 
pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.424816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db4z9\" (UniqueName: \"kubernetes.io/projected/0f58bf8d-23a6-4437-b77d-c5859fd49afc-kube-api-access-db4z9\") pod \"auto-csr-approver-29561954-w8ddm\" (UID: \"0f58bf8d-23a6-4437-b77d-c5859fd49afc\") " pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.450244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db4z9\" (UniqueName: \"kubernetes.io/projected/0f58bf8d-23a6-4437-b77d-c5859fd49afc-kube-api-access-db4z9\") pod \"auto-csr-approver-29561954-w8ddm\" (UID: \"0f58bf8d-23a6-4437-b77d-c5859fd49afc\") " pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.490005 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pjw9b"] Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.491818 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.492664 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.596424 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjw9b"] Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.629829 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-utilities\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.630148 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mgwv\" (UniqueName: \"kubernetes.io/projected/acb002c8-d026-447e-b912-dd0896701664-kube-api-access-6mgwv\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.630213 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-catalog-content\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.731716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-utilities\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.732089 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6mgwv\" (UniqueName: \"kubernetes.io/projected/acb002c8-d026-447e-b912-dd0896701664-kube-api-access-6mgwv\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.732120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-catalog-content\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.732394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-utilities\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.732538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-catalog-content\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.759396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mgwv\" (UniqueName: \"kubernetes.io/projected/acb002c8-d026-447e-b912-dd0896701664-kube-api-access-6mgwv\") pod \"redhat-marketplace-pjw9b\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:00 crc kubenswrapper[4735]: I0317 03:14:00.847354 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:01 crc kubenswrapper[4735]: I0317 03:14:01.134525 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561954-w8ddm"] Mar 17 03:14:01 crc kubenswrapper[4735]: I0317 03:14:01.204773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" event={"ID":"0f58bf8d-23a6-4437-b77d-c5859fd49afc","Type":"ContainerStarted","Data":"0be0f9ddb7c8b3d4bb049669018e69e70ead898e268514c35f28df2d7aa7b51c"} Mar 17 03:14:01 crc kubenswrapper[4735]: I0317 03:14:01.415392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjw9b"] Mar 17 03:14:02 crc kubenswrapper[4735]: I0317 03:14:02.219057 4735 generic.go:334] "Generic (PLEG): container finished" podID="acb002c8-d026-447e-b912-dd0896701664" containerID="17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81" exitCode=0 Mar 17 03:14:02 crc kubenswrapper[4735]: I0317 03:14:02.219106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerDied","Data":"17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81"} Mar 17 03:14:02 crc kubenswrapper[4735]: I0317 03:14:02.219451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerStarted","Data":"e55b90beaa3ffc26c521ed17e5f6549ffb00f22c99b0caac4e0a41d353b99794"} Mar 17 03:14:03 crc kubenswrapper[4735]: I0317 03:14:03.229340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" event={"ID":"0f58bf8d-23a6-4437-b77d-c5859fd49afc","Type":"ContainerStarted","Data":"333fab1a50203b7e143c8a66addf28d9292ecd1883c32c6263acf85dfa43f7c4"} Mar 17 03:14:03 crc 
kubenswrapper[4735]: I0317 03:14:03.232536 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerStarted","Data":"44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377"} Mar 17 03:14:03 crc kubenswrapper[4735]: I0317 03:14:03.254504 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" podStartSLOduration=2.119753616 podStartE2EDuration="3.254485086s" podCreationTimestamp="2026-03-17 03:14:00 +0000 UTC" firstStartedPulling="2026-03-17 03:14:01.145337502 +0000 UTC m=+7466.777570480" lastFinishedPulling="2026-03-17 03:14:02.280068932 +0000 UTC m=+7467.912301950" observedRunningTime="2026-03-17 03:14:03.249240939 +0000 UTC m=+7468.881473928" watchObservedRunningTime="2026-03-17 03:14:03.254485086 +0000 UTC m=+7468.886718074" Mar 17 03:14:04 crc kubenswrapper[4735]: I0317 03:14:04.242101 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f58bf8d-23a6-4437-b77d-c5859fd49afc" containerID="333fab1a50203b7e143c8a66addf28d9292ecd1883c32c6263acf85dfa43f7c4" exitCode=0 Mar 17 03:14:04 crc kubenswrapper[4735]: I0317 03:14:04.242893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" event={"ID":"0f58bf8d-23a6-4437-b77d-c5859fd49afc","Type":"ContainerDied","Data":"333fab1a50203b7e143c8a66addf28d9292ecd1883c32c6263acf85dfa43f7c4"} Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.080702 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:14:05 crc kubenswrapper[4735]: E0317 03:14:05.081229 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.252784 4735 generic.go:334] "Generic (PLEG): container finished" podID="acb002c8-d026-447e-b912-dd0896701664" containerID="44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377" exitCode=0 Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.253043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerDied","Data":"44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377"} Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.695909 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.848776 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db4z9\" (UniqueName: \"kubernetes.io/projected/0f58bf8d-23a6-4437-b77d-c5859fd49afc-kube-api-access-db4z9\") pod \"0f58bf8d-23a6-4437-b77d-c5859fd49afc\" (UID: \"0f58bf8d-23a6-4437-b77d-c5859fd49afc\") " Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.855685 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f58bf8d-23a6-4437-b77d-c5859fd49afc-kube-api-access-db4z9" (OuterVolumeSpecName: "kube-api-access-db4z9") pod "0f58bf8d-23a6-4437-b77d-c5859fd49afc" (UID: "0f58bf8d-23a6-4437-b77d-c5859fd49afc"). InnerVolumeSpecName "kube-api-access-db4z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:14:05 crc kubenswrapper[4735]: I0317 03:14:05.951020 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db4z9\" (UniqueName: \"kubernetes.io/projected/0f58bf8d-23a6-4437-b77d-c5859fd49afc-kube-api-access-db4z9\") on node \"crc\" DevicePath \"\"" Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.265052 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" event={"ID":"0f58bf8d-23a6-4437-b77d-c5859fd49afc","Type":"ContainerDied","Data":"0be0f9ddb7c8b3d4bb049669018e69e70ead898e268514c35f28df2d7aa7b51c"} Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.265089 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561954-w8ddm" Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.265094 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0be0f9ddb7c8b3d4bb049669018e69e70ead898e268514c35f28df2d7aa7b51c" Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.268747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerStarted","Data":"d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54"} Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.320486 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pjw9b" podStartSLOduration=2.824629365 podStartE2EDuration="6.320463501s" podCreationTimestamp="2026-03-17 03:14:00 +0000 UTC" firstStartedPulling="2026-03-17 03:14:02.223164724 +0000 UTC m=+7467.855397692" lastFinishedPulling="2026-03-17 03:14:05.71899885 +0000 UTC m=+7471.351231828" observedRunningTime="2026-03-17 03:14:06.306641429 +0000 UTC m=+7471.938874427" watchObservedRunningTime="2026-03-17 
03:14:06.320463501 +0000 UTC m=+7471.952696479" Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.341455 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561948-9sxpm"] Mar 17 03:14:06 crc kubenswrapper[4735]: I0317 03:14:06.354794 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561948-9sxpm"] Mar 17 03:14:07 crc kubenswrapper[4735]: I0317 03:14:07.083534 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3637e311-d76f-4485-a03b-add96cb61333" path="/var/lib/kubelet/pods/3637e311-d76f-4485-a03b-add96cb61333/volumes" Mar 17 03:14:10 crc kubenswrapper[4735]: I0317 03:14:10.847566 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:10 crc kubenswrapper[4735]: I0317 03:14:10.848174 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:11 crc kubenswrapper[4735]: I0317 03:14:11.900712 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pjw9b" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="registry-server" probeResult="failure" output=< Mar 17 03:14:11 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:14:11 crc kubenswrapper[4735]: > Mar 17 03:14:18 crc kubenswrapper[4735]: I0317 03:14:18.082534 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:14:18 crc kubenswrapper[4735]: E0317 03:14:18.086878 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:14:20 crc kubenswrapper[4735]: I0317 03:14:20.914996 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:20 crc kubenswrapper[4735]: I0317 03:14:20.978186 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:21 crc kubenswrapper[4735]: I0317 03:14:21.169104 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjw9b"] Mar 17 03:14:22 crc kubenswrapper[4735]: I0317 03:14:22.452765 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pjw9b" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="registry-server" containerID="cri-o://d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54" gracePeriod=2 Mar 17 03:14:22 crc kubenswrapper[4735]: I0317 03:14:22.950195 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.153331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-catalog-content\") pod \"acb002c8-d026-447e-b912-dd0896701664\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.153544 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mgwv\" (UniqueName: \"kubernetes.io/projected/acb002c8-d026-447e-b912-dd0896701664-kube-api-access-6mgwv\") pod \"acb002c8-d026-447e-b912-dd0896701664\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.153647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-utilities\") pod \"acb002c8-d026-447e-b912-dd0896701664\" (UID: \"acb002c8-d026-447e-b912-dd0896701664\") " Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.154474 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-utilities" (OuterVolumeSpecName: "utilities") pod "acb002c8-d026-447e-b912-dd0896701664" (UID: "acb002c8-d026-447e-b912-dd0896701664"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.164050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb002c8-d026-447e-b912-dd0896701664-kube-api-access-6mgwv" (OuterVolumeSpecName: "kube-api-access-6mgwv") pod "acb002c8-d026-447e-b912-dd0896701664" (UID: "acb002c8-d026-447e-b912-dd0896701664"). InnerVolumeSpecName "kube-api-access-6mgwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.183551 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acb002c8-d026-447e-b912-dd0896701664" (UID: "acb002c8-d026-447e-b912-dd0896701664"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.256038 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.257307 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mgwv\" (UniqueName: \"kubernetes.io/projected/acb002c8-d026-447e-b912-dd0896701664-kube-api-access-6mgwv\") on node \"crc\" DevicePath \"\"" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.257388 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb002c8-d026-447e-b912-dd0896701664-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.462087 4735 generic.go:334] "Generic (PLEG): container finished" podID="acb002c8-d026-447e-b912-dd0896701664" containerID="d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54" exitCode=0 Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.462144 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjw9b" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.462144 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerDied","Data":"d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54"} Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.463556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjw9b" event={"ID":"acb002c8-d026-447e-b912-dd0896701664","Type":"ContainerDied","Data":"e55b90beaa3ffc26c521ed17e5f6549ffb00f22c99b0caac4e0a41d353b99794"} Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.463581 4735 scope.go:117] "RemoveContainer" containerID="d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.483389 4735 scope.go:117] "RemoveContainer" containerID="44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.515458 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjw9b"] Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.522423 4735 scope.go:117] "RemoveContainer" containerID="17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.532433 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjw9b"] Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.582358 4735 scope.go:117] "RemoveContainer" containerID="d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54" Mar 17 03:14:23 crc kubenswrapper[4735]: E0317 03:14:23.582846 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54\": container with ID starting with d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54 not found: ID does not exist" containerID="d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.582953 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54"} err="failed to get container status \"d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54\": rpc error: code = NotFound desc = could not find container \"d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54\": container with ID starting with d401f32456ed79e6e5b6183560a56a0e66a6b8c69ead727802e001aafa749a54 not found: ID does not exist" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.582977 4735 scope.go:117] "RemoveContainer" containerID="44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377" Mar 17 03:14:23 crc kubenswrapper[4735]: E0317 03:14:23.583425 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377\": container with ID starting with 44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377 not found: ID does not exist" containerID="44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.583451 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377"} err="failed to get container status \"44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377\": rpc error: code = NotFound desc = could not find container \"44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377\": container with ID 
starting with 44a19fda7344a7bdd0bf1829b868baf414177ed8689ccf97dc2716bd7dfad377 not found: ID does not exist" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.583465 4735 scope.go:117] "RemoveContainer" containerID="17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81" Mar 17 03:14:23 crc kubenswrapper[4735]: E0317 03:14:23.583701 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81\": container with ID starting with 17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81 not found: ID does not exist" containerID="17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81" Mar 17 03:14:23 crc kubenswrapper[4735]: I0317 03:14:23.583722 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81"} err="failed to get container status \"17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81\": rpc error: code = NotFound desc = could not find container \"17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81\": container with ID starting with 17f83a0438bc942d5e7e42a6b28dbff7ce20f42ebef6afffc4bfa84dcd56ac81 not found: ID does not exist" Mar 17 03:14:25 crc kubenswrapper[4735]: I0317 03:14:25.093426 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb002c8-d026-447e-b912-dd0896701664" path="/var/lib/kubelet/pods/acb002c8-d026-447e-b912-dd0896701664/volumes" Mar 17 03:14:33 crc kubenswrapper[4735]: I0317 03:14:33.073817 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:14:33 crc kubenswrapper[4735]: E0317 03:14:33.075164 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:14:45 crc kubenswrapper[4735]: I0317 03:14:45.086546 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:14:45 crc kubenswrapper[4735]: E0317 03:14:45.087763 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.072665 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:15:00 crc kubenswrapper[4735]: E0317 03:15:00.076367 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.169384 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx"] Mar 17 03:15:00 crc kubenswrapper[4735]: E0317 03:15:00.170649 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f58bf8d-23a6-4437-b77d-c5859fd49afc" containerName="oc" 
Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.170761 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f58bf8d-23a6-4437-b77d-c5859fd49afc" containerName="oc" Mar 17 03:15:00 crc kubenswrapper[4735]: E0317 03:15:00.170909 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="extract-content" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.171001 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="extract-content" Mar 17 03:15:00 crc kubenswrapper[4735]: E0317 03:15:00.171097 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="registry-server" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.171193 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="registry-server" Mar 17 03:15:00 crc kubenswrapper[4735]: E0317 03:15:00.171298 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="extract-utilities" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.171374 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="extract-utilities" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.172901 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb002c8-d026-447e-b912-dd0896701664" containerName="registry-server" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.173043 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f58bf8d-23a6-4437-b77d-c5859fd49afc" containerName="oc" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.173844 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.176414 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.177316 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.178623 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx"] Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.349677 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-secret-volume\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.349764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4474f\" (UniqueName: \"kubernetes.io/projected/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-kube-api-access-4474f\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.349821 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-config-volume\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.451297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-secret-volume\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.451615 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4474f\" (UniqueName: \"kubernetes.io/projected/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-kube-api-access-4474f\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.451758 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-config-volume\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.452700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-config-volume\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.462596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-secret-volume\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.484080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4474f\" (UniqueName: \"kubernetes.io/projected/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-kube-api-access-4474f\") pod \"collect-profiles-29561955-mmfvx\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.505285 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:00 crc kubenswrapper[4735]: I0317 03:15:00.960996 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx"] Mar 17 03:15:01 crc kubenswrapper[4735]: I0317 03:15:01.915942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" event={"ID":"0ecf8c9f-f094-4249-8377-28d3fa0cf02a","Type":"ContainerStarted","Data":"177f37481e6f4d4b57b9b837f3ae5611c0bbd75c8ded87ed360d29c984a03329"} Mar 17 03:15:01 crc kubenswrapper[4735]: I0317 03:15:01.916294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" event={"ID":"0ecf8c9f-f094-4249-8377-28d3fa0cf02a","Type":"ContainerStarted","Data":"e81429b5e30d1965af89479b8e90937415f3a982d7a33f80d6bc371b4b0d4dc9"} Mar 17 03:15:02 crc kubenswrapper[4735]: I0317 03:15:02.929616 4735 generic.go:334] "Generic (PLEG): container finished" podID="0ecf8c9f-f094-4249-8377-28d3fa0cf02a" 
containerID="177f37481e6f4d4b57b9b837f3ae5611c0bbd75c8ded87ed360d29c984a03329" exitCode=0 Mar 17 03:15:02 crc kubenswrapper[4735]: I0317 03:15:02.929703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" event={"ID":"0ecf8c9f-f094-4249-8377-28d3fa0cf02a","Type":"ContainerDied","Data":"177f37481e6f4d4b57b9b837f3ae5611c0bbd75c8ded87ed360d29c984a03329"} Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.494935 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.621183 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4474f\" (UniqueName: \"kubernetes.io/projected/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-kube-api-access-4474f\") pod \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.621331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-config-volume\") pod \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.621385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-secret-volume\") pod \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\" (UID: \"0ecf8c9f-f094-4249-8377-28d3fa0cf02a\") " Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.622117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"0ecf8c9f-f094-4249-8377-28d3fa0cf02a" (UID: "0ecf8c9f-f094-4249-8377-28d3fa0cf02a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.628011 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-kube-api-access-4474f" (OuterVolumeSpecName: "kube-api-access-4474f") pod "0ecf8c9f-f094-4249-8377-28d3fa0cf02a" (UID: "0ecf8c9f-f094-4249-8377-28d3fa0cf02a"). InnerVolumeSpecName "kube-api-access-4474f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.629014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ecf8c9f-f094-4249-8377-28d3fa0cf02a" (UID: "0ecf8c9f-f094-4249-8377-28d3fa0cf02a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.725240 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4474f\" (UniqueName: \"kubernetes.io/projected/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-kube-api-access-4474f\") on node \"crc\" DevicePath \"\"" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.725528 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.725657 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ecf8c9f-f094-4249-8377-28d3fa0cf02a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.944015 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" event={"ID":"0ecf8c9f-f094-4249-8377-28d3fa0cf02a","Type":"ContainerDied","Data":"e81429b5e30d1965af89479b8e90937415f3a982d7a33f80d6bc371b4b0d4dc9"} Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.944953 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e81429b5e30d1965af89479b8e90937415f3a982d7a33f80d6bc371b4b0d4dc9" Mar 17 03:15:03 crc kubenswrapper[4735]: I0317 03:15:03.944045 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx" Mar 17 03:15:04 crc kubenswrapper[4735]: I0317 03:15:04.592921 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn"] Mar 17 03:15:04 crc kubenswrapper[4735]: I0317 03:15:04.602758 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-dkgcn"] Mar 17 03:15:05 crc kubenswrapper[4735]: I0317 03:15:05.094513 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7241f2-ed80-4f1d-8c72-56826fd49958" path="/var/lib/kubelet/pods/7a7241f2-ed80-4f1d-8c72-56826fd49958/volumes" Mar 17 03:15:05 crc kubenswrapper[4735]: I0317 03:15:05.934534 4735 scope.go:117] "RemoveContainer" containerID="2a84b0e631cac7bfc5da742b08be32432e3735e53c23c0e186db6266b057575d" Mar 17 03:15:06 crc kubenswrapper[4735]: I0317 03:15:06.019051 4735 scope.go:117] "RemoveContainer" containerID="642f1d5f3d65cc61c1a39810025ecdd6d13738f5f1be49c48905f0bddd9b490e" Mar 17 03:15:13 crc kubenswrapper[4735]: I0317 03:15:13.073151 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40" Mar 17 03:15:14 crc kubenswrapper[4735]: I0317 03:15:14.092970 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"d312acc7754de86a75654f421960d6f383d57e7b4013f36754cd39ec70c0a44a"} Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.157528 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561956-kl52k"] Mar 17 03:16:00 crc kubenswrapper[4735]: E0317 03:16:00.158847 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecf8c9f-f094-4249-8377-28d3fa0cf02a" containerName="collect-profiles" Mar 17 
03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.158899 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecf8c9f-f094-4249-8377-28d3fa0cf02a" containerName="collect-profiles" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.159242 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecf8c9f-f094-4249-8377-28d3fa0cf02a" containerName="collect-profiles" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.160317 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561956-kl52k" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.162799 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.162799 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.163097 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.172798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb969\" (UniqueName: \"kubernetes.io/projected/e806fb2d-451a-45dc-9a4d-06798f4d7f6c-kube-api-access-zb969\") pod \"auto-csr-approver-29561956-kl52k\" (UID: \"e806fb2d-451a-45dc-9a4d-06798f4d7f6c\") " pod="openshift-infra/auto-csr-approver-29561956-kl52k" Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.175136 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561956-kl52k"] Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.274457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb969\" (UniqueName: \"kubernetes.io/projected/e806fb2d-451a-45dc-9a4d-06798f4d7f6c-kube-api-access-zb969\") pod 
\"auto-csr-approver-29561956-kl52k\" (UID: \"e806fb2d-451a-45dc-9a4d-06798f4d7f6c\") " pod="openshift-infra/auto-csr-approver-29561956-kl52k"
Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.302634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb969\" (UniqueName: \"kubernetes.io/projected/e806fb2d-451a-45dc-9a4d-06798f4d7f6c-kube-api-access-zb969\") pod \"auto-csr-approver-29561956-kl52k\" (UID: \"e806fb2d-451a-45dc-9a4d-06798f4d7f6c\") " pod="openshift-infra/auto-csr-approver-29561956-kl52k"
Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.484388 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561956-kl52k"
Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.981552 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561956-kl52k"]
Mar 17 03:16:00 crc kubenswrapper[4735]: I0317 03:16:00.999619 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 03:16:01 crc kubenswrapper[4735]: I0317 03:16:01.988418 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561956-kl52k" event={"ID":"e806fb2d-451a-45dc-9a4d-06798f4d7f6c","Type":"ContainerStarted","Data":"7dfdeb61538d2bf6031f80a8fd82920a5197d72a5b1ed83adf27f5adbac26557"}
Mar 17 03:16:03 crc kubenswrapper[4735]: I0317 03:16:03.001824 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561956-kl52k" event={"ID":"e806fb2d-451a-45dc-9a4d-06798f4d7f6c","Type":"ContainerStarted","Data":"2a958e631d83fecfebd5e9e5d3aa2be460561a6584e9b4ae561b5d6c76f8b838"}
Mar 17 03:16:03 crc kubenswrapper[4735]: I0317 03:16:03.034521 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561956-kl52k" podStartSLOduration=2.163559614 podStartE2EDuration="3.034307616s" podCreationTimestamp="2026-03-17 03:16:00 +0000 UTC" firstStartedPulling="2026-03-17 03:16:00.996057601 +0000 UTC m=+7586.628290609" lastFinishedPulling="2026-03-17 03:16:01.866805633 +0000 UTC m=+7587.499038611" observedRunningTime="2026-03-17 03:16:03.022328767 +0000 UTC m=+7588.654561745" watchObservedRunningTime="2026-03-17 03:16:03.034307616 +0000 UTC m=+7588.666540614"
Mar 17 03:16:04 crc kubenswrapper[4735]: I0317 03:16:04.018989 4735 generic.go:334] "Generic (PLEG): container finished" podID="e806fb2d-451a-45dc-9a4d-06798f4d7f6c" containerID="2a958e631d83fecfebd5e9e5d3aa2be460561a6584e9b4ae561b5d6c76f8b838" exitCode=0
Mar 17 03:16:04 crc kubenswrapper[4735]: I0317 03:16:04.019054 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561956-kl52k" event={"ID":"e806fb2d-451a-45dc-9a4d-06798f4d7f6c","Type":"ContainerDied","Data":"2a958e631d83fecfebd5e9e5d3aa2be460561a6584e9b4ae561b5d6c76f8b838"}
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.237328 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfr2l"]
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.240073 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.261072 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfr2l"]
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.397458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-utilities\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.397517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj4s8\" (UniqueName: \"kubernetes.io/projected/ed790cdd-1801-4b18-965d-17a6b8f2dee6-kube-api-access-hj4s8\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.397645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-catalog-content\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.500315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-catalog-content\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.500384 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-utilities\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.500413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj4s8\" (UniqueName: \"kubernetes.io/projected/ed790cdd-1801-4b18-965d-17a6b8f2dee6-kube-api-access-hj4s8\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.500932 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-catalog-content\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.501171 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-utilities\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.525582 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj4s8\" (UniqueName: \"kubernetes.io/projected/ed790cdd-1801-4b18-965d-17a6b8f2dee6-kube-api-access-hj4s8\") pod \"certified-operators-sfr2l\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") " pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.562396 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.582347 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561956-kl52k"
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.703762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb969\" (UniqueName: \"kubernetes.io/projected/e806fb2d-451a-45dc-9a4d-06798f4d7f6c-kube-api-access-zb969\") pod \"e806fb2d-451a-45dc-9a4d-06798f4d7f6c\" (UID: \"e806fb2d-451a-45dc-9a4d-06798f4d7f6c\") "
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.711455 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e806fb2d-451a-45dc-9a4d-06798f4d7f6c-kube-api-access-zb969" (OuterVolumeSpecName: "kube-api-access-zb969") pod "e806fb2d-451a-45dc-9a4d-06798f4d7f6c" (UID: "e806fb2d-451a-45dc-9a4d-06798f4d7f6c"). InnerVolumeSpecName "kube-api-access-zb969". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 03:16:05 crc kubenswrapper[4735]: I0317 03:16:05.808839 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb969\" (UniqueName: \"kubernetes.io/projected/e806fb2d-451a-45dc-9a4d-06798f4d7f6c-kube-api-access-zb969\") on node \"crc\" DevicePath \"\""
Mar 17 03:16:06 crc kubenswrapper[4735]: I0317 03:16:06.057835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561956-kl52k" event={"ID":"e806fb2d-451a-45dc-9a4d-06798f4d7f6c","Type":"ContainerDied","Data":"7dfdeb61538d2bf6031f80a8fd82920a5197d72a5b1ed83adf27f5adbac26557"}
Mar 17 03:16:06 crc kubenswrapper[4735]: I0317 03:16:06.058143 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfdeb61538d2bf6031f80a8fd82920a5197d72a5b1ed83adf27f5adbac26557"
Mar 17 03:16:06 crc kubenswrapper[4735]: I0317 03:16:06.057930 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561956-kl52k"
Mar 17 03:16:06 crc kubenswrapper[4735]: W0317 03:16:06.092924 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded790cdd_1801_4b18_965d_17a6b8f2dee6.slice/crio-f1ce1143059012e3a663cba61129822d27107b50a700f764137c81cec7b848d6 WatchSource:0}: Error finding container f1ce1143059012e3a663cba61129822d27107b50a700f764137c81cec7b848d6: Status 404 returned error can't find the container with id f1ce1143059012e3a663cba61129822d27107b50a700f764137c81cec7b848d6
Mar 17 03:16:06 crc kubenswrapper[4735]: I0317 03:16:06.100394 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfr2l"]
Mar 17 03:16:06 crc kubenswrapper[4735]: I0317 03:16:06.137772 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561950-9cn7g"]
Mar 17 03:16:06 crc kubenswrapper[4735]: I0317 03:16:06.174834 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561950-9cn7g"]
Mar 17 03:16:07 crc kubenswrapper[4735]: I0317 03:16:07.071663 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerID="823f8951a9235e8157a306ae9aef31af292117516bda2e9f620bdcf2386a20a2" exitCode=0
Mar 17 03:16:07 crc kubenswrapper[4735]: I0317 03:16:07.071736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerDied","Data":"823f8951a9235e8157a306ae9aef31af292117516bda2e9f620bdcf2386a20a2"}
Mar 17 03:16:07 crc kubenswrapper[4735]: I0317 03:16:07.073044 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerStarted","Data":"f1ce1143059012e3a663cba61129822d27107b50a700f764137c81cec7b848d6"}
Mar 17 03:16:07 crc kubenswrapper[4735]: I0317 03:16:07.095314 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d67eba-b70c-45df-8ca8-1c9950934639" path="/var/lib/kubelet/pods/d2d67eba-b70c-45df-8ca8-1c9950934639/volumes"
Mar 17 03:16:08 crc kubenswrapper[4735]: I0317 03:16:08.084279 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerStarted","Data":"99c9c5e57eabbc22fb4687621a2b5105e2a795e1147f9c9720fe393f2dfd71fe"}
Mar 17 03:16:10 crc kubenswrapper[4735]: I0317 03:16:10.104288 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerID="99c9c5e57eabbc22fb4687621a2b5105e2a795e1147f9c9720fe393f2dfd71fe" exitCode=0
Mar 17 03:16:10 crc kubenswrapper[4735]: I0317 03:16:10.104660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerDied","Data":"99c9c5e57eabbc22fb4687621a2b5105e2a795e1147f9c9720fe393f2dfd71fe"}
Mar 17 03:16:11 crc kubenswrapper[4735]: I0317 03:16:11.117243 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerStarted","Data":"d6e117b77af9517b5d14d38ef1b90b88a0d1685faab02dbe743107a18a454101"}
Mar 17 03:16:11 crc kubenswrapper[4735]: I0317 03:16:11.146064 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfr2l" podStartSLOduration=2.666240758 podStartE2EDuration="6.146047318s" podCreationTimestamp="2026-03-17 03:16:05 +0000 UTC" firstStartedPulling="2026-03-17 03:16:07.074474999 +0000 UTC m=+7592.706707977" lastFinishedPulling="2026-03-17 03:16:10.554281549 +0000 UTC m=+7596.186514537" observedRunningTime="2026-03-17 03:16:11.137559834 +0000 UTC m=+7596.769792812" watchObservedRunningTime="2026-03-17 03:16:11.146047318 +0000 UTC m=+7596.778280286"
Mar 17 03:16:15 crc kubenswrapper[4735]: I0317 03:16:15.564138 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:15 crc kubenswrapper[4735]: I0317 03:16:15.566104 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:16 crc kubenswrapper[4735]: I0317 03:16:16.637317 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sfr2l" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="registry-server" probeResult="failure" output=<
Mar 17 03:16:16 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 03:16:16 crc kubenswrapper[4735]: >
Mar 17 03:16:25 crc kubenswrapper[4735]: I0317 03:16:25.674058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:25 crc kubenswrapper[4735]: I0317 03:16:25.746351 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:25 crc kubenswrapper[4735]: I0317 03:16:25.910447 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfr2l"]
Mar 17 03:16:27 crc kubenswrapper[4735]: I0317 03:16:27.278285 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfr2l" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="registry-server" containerID="cri-o://d6e117b77af9517b5d14d38ef1b90b88a0d1685faab02dbe743107a18a454101" gracePeriod=2
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.288239 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerID="d6e117b77af9517b5d14d38ef1b90b88a0d1685faab02dbe743107a18a454101" exitCode=0
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.288285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerDied","Data":"d6e117b77af9517b5d14d38ef1b90b88a0d1685faab02dbe743107a18a454101"}
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.288313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2l" event={"ID":"ed790cdd-1801-4b18-965d-17a6b8f2dee6","Type":"ContainerDied","Data":"f1ce1143059012e3a663cba61129822d27107b50a700f764137c81cec7b848d6"}
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.288327 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1ce1143059012e3a663cba61129822d27107b50a700f764137c81cec7b848d6"
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.305537 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.391291 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj4s8\" (UniqueName: \"kubernetes.io/projected/ed790cdd-1801-4b18-965d-17a6b8f2dee6-kube-api-access-hj4s8\") pod \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") "
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.391487 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-utilities\") pod \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") "
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.391545 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-catalog-content\") pod \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\" (UID: \"ed790cdd-1801-4b18-965d-17a6b8f2dee6\") "
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.392707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-utilities" (OuterVolumeSpecName: "utilities") pod "ed790cdd-1801-4b18-965d-17a6b8f2dee6" (UID: "ed790cdd-1801-4b18-965d-17a6b8f2dee6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.406100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed790cdd-1801-4b18-965d-17a6b8f2dee6-kube-api-access-hj4s8" (OuterVolumeSpecName: "kube-api-access-hj4s8") pod "ed790cdd-1801-4b18-965d-17a6b8f2dee6" (UID: "ed790cdd-1801-4b18-965d-17a6b8f2dee6"). InnerVolumeSpecName "kube-api-access-hj4s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.441738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed790cdd-1801-4b18-965d-17a6b8f2dee6" (UID: "ed790cdd-1801-4b18-965d-17a6b8f2dee6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.493314 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj4s8\" (UniqueName: \"kubernetes.io/projected/ed790cdd-1801-4b18-965d-17a6b8f2dee6-kube-api-access-hj4s8\") on node \"crc\" DevicePath \"\""
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.493351 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 03:16:28 crc kubenswrapper[4735]: I0317 03:16:28.493364 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed790cdd-1801-4b18-965d-17a6b8f2dee6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 03:16:29 crc kubenswrapper[4735]: I0317 03:16:29.294399 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2l"
Mar 17 03:16:29 crc kubenswrapper[4735]: I0317 03:16:29.320119 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfr2l"]
Mar 17 03:16:29 crc kubenswrapper[4735]: I0317 03:16:29.329203 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfr2l"]
Mar 17 03:16:31 crc kubenswrapper[4735]: I0317 03:16:31.083621 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" path="/var/lib/kubelet/pods/ed790cdd-1801-4b18-965d-17a6b8f2dee6/volumes"
Mar 17 03:17:06 crc kubenswrapper[4735]: I0317 03:17:06.153982 4735 scope.go:117] "RemoveContainer" containerID="7efce6fc910680bf69ff3239f4cc23fbdc2c61a9c7cee3acba28e9732d80805b"
Mar 17 03:17:42 crc kubenswrapper[4735]: I0317 03:17:42.606234 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 03:17:42 crc kubenswrapper[4735]: I0317 03:17:42.607438 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.165148 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561958-tpr5b"]
Mar 17 03:18:00 crc kubenswrapper[4735]: E0317 03:18:00.166358 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e806fb2d-451a-45dc-9a4d-06798f4d7f6c" containerName="oc"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.166381 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e806fb2d-451a-45dc-9a4d-06798f4d7f6c" containerName="oc"
Mar 17 03:18:00 crc kubenswrapper[4735]: E0317 03:18:00.166414 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="registry-server"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.166426 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="registry-server"
Mar 17 03:18:00 crc kubenswrapper[4735]: E0317 03:18:00.166452 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="extract-utilities"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.166466 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="extract-utilities"
Mar 17 03:18:00 crc kubenswrapper[4735]: E0317 03:18:00.166509 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="extract-content"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.166521 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="extract-content"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.166827 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed790cdd-1801-4b18-965d-17a6b8f2dee6" containerName="registry-server"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.166903 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e806fb2d-451a-45dc-9a4d-06798f4d7f6c" containerName="oc"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.168999 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.171369 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.172242 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.176036 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.180067 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561958-tpr5b"]
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.267976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qgr2\" (UniqueName: \"kubernetes.io/projected/33eb0484-f254-44cc-862e-9b274c0fabbc-kube-api-access-4qgr2\") pod \"auto-csr-approver-29561958-tpr5b\" (UID: \"33eb0484-f254-44cc-862e-9b274c0fabbc\") " pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.369369 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qgr2\" (UniqueName: \"kubernetes.io/projected/33eb0484-f254-44cc-862e-9b274c0fabbc-kube-api-access-4qgr2\") pod \"auto-csr-approver-29561958-tpr5b\" (UID: \"33eb0484-f254-44cc-862e-9b274c0fabbc\") " pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.394620 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qgr2\" (UniqueName: \"kubernetes.io/projected/33eb0484-f254-44cc-862e-9b274c0fabbc-kube-api-access-4qgr2\") pod \"auto-csr-approver-29561958-tpr5b\" (UID: \"33eb0484-f254-44cc-862e-9b274c0fabbc\") " pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:00 crc kubenswrapper[4735]: I0317 03:18:00.496686 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:01 crc kubenswrapper[4735]: I0317 03:18:01.011031 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561958-tpr5b"]
Mar 17 03:18:01 crc kubenswrapper[4735]: W0317 03:18:01.023044 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33eb0484_f254_44cc_862e_9b274c0fabbc.slice/crio-bac1c91b9938d963a557f552e963829f8d8e6b24c5e184a5acc094d0eb16bbd2 WatchSource:0}: Error finding container bac1c91b9938d963a557f552e963829f8d8e6b24c5e184a5acc094d0eb16bbd2: Status 404 returned error can't find the container with id bac1c91b9938d963a557f552e963829f8d8e6b24c5e184a5acc094d0eb16bbd2
Mar 17 03:18:01 crc kubenswrapper[4735]: I0317 03:18:01.195427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561958-tpr5b" event={"ID":"33eb0484-f254-44cc-862e-9b274c0fabbc","Type":"ContainerStarted","Data":"bac1c91b9938d963a557f552e963829f8d8e6b24c5e184a5acc094d0eb16bbd2"}
Mar 17 03:18:03 crc kubenswrapper[4735]: I0317 03:18:03.229312 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561958-tpr5b" event={"ID":"33eb0484-f254-44cc-862e-9b274c0fabbc","Type":"ContainerStarted","Data":"285bdbd71783bbc88ae918daa6d20f2f384a709ac2b7643f42e28448a7b42c6b"}
Mar 17 03:18:03 crc kubenswrapper[4735]: I0317 03:18:03.244836 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561958-tpr5b" podStartSLOduration=2.116589318 podStartE2EDuration="3.244820955s" podCreationTimestamp="2026-03-17 03:18:00 +0000 UTC" firstStartedPulling="2026-03-17 03:18:01.02615119 +0000 UTC m=+7706.658384168" lastFinishedPulling="2026-03-17 03:18:02.154382827 +0000 UTC m=+7707.786615805" observedRunningTime="2026-03-17 03:18:03.241141656 +0000 UTC m=+7708.873374634" watchObservedRunningTime="2026-03-17 03:18:03.244820955 +0000 UTC m=+7708.877053933"
Mar 17 03:18:04 crc kubenswrapper[4735]: I0317 03:18:04.246318 4735 generic.go:334] "Generic (PLEG): container finished" podID="33eb0484-f254-44cc-862e-9b274c0fabbc" containerID="285bdbd71783bbc88ae918daa6d20f2f384a709ac2b7643f42e28448a7b42c6b" exitCode=0
Mar 17 03:18:04 crc kubenswrapper[4735]: I0317 03:18:04.246415 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561958-tpr5b" event={"ID":"33eb0484-f254-44cc-862e-9b274c0fabbc","Type":"ContainerDied","Data":"285bdbd71783bbc88ae918daa6d20f2f384a709ac2b7643f42e28448a7b42c6b"}
Mar 17 03:18:05 crc kubenswrapper[4735]: I0317 03:18:05.641369 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:05 crc kubenswrapper[4735]: I0317 03:18:05.780698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qgr2\" (UniqueName: \"kubernetes.io/projected/33eb0484-f254-44cc-862e-9b274c0fabbc-kube-api-access-4qgr2\") pod \"33eb0484-f254-44cc-862e-9b274c0fabbc\" (UID: \"33eb0484-f254-44cc-862e-9b274c0fabbc\") "
Mar 17 03:18:05 crc kubenswrapper[4735]: I0317 03:18:05.811808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33eb0484-f254-44cc-862e-9b274c0fabbc-kube-api-access-4qgr2" (OuterVolumeSpecName: "kube-api-access-4qgr2") pod "33eb0484-f254-44cc-862e-9b274c0fabbc" (UID: "33eb0484-f254-44cc-862e-9b274c0fabbc"). InnerVolumeSpecName "kube-api-access-4qgr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 03:18:05 crc kubenswrapper[4735]: I0317 03:18:05.882829 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qgr2\" (UniqueName: \"kubernetes.io/projected/33eb0484-f254-44cc-862e-9b274c0fabbc-kube-api-access-4qgr2\") on node \"crc\" DevicePath \"\""
Mar 17 03:18:06 crc kubenswrapper[4735]: I0317 03:18:06.262750 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561958-tpr5b" event={"ID":"33eb0484-f254-44cc-862e-9b274c0fabbc","Type":"ContainerDied","Data":"bac1c91b9938d963a557f552e963829f8d8e6b24c5e184a5acc094d0eb16bbd2"}
Mar 17 03:18:06 crc kubenswrapper[4735]: I0317 03:18:06.262789 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bac1c91b9938d963a557f552e963829f8d8e6b24c5e184a5acc094d0eb16bbd2"
Mar 17 03:18:06 crc kubenswrapper[4735]: I0317 03:18:06.262835 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561958-tpr5b"
Mar 17 03:18:06 crc kubenswrapper[4735]: I0317 03:18:06.319522 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561952-dcxvk"]
Mar 17 03:18:06 crc kubenswrapper[4735]: I0317 03:18:06.332631 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561952-dcxvk"]
Mar 17 03:18:07 crc kubenswrapper[4735]: I0317 03:18:07.090159 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a04dd83-8815-41ec-b2d2-eece5171d802" path="/var/lib/kubelet/pods/2a04dd83-8815-41ec-b2d2-eece5171d802/volumes"
Mar 17 03:18:12 crc kubenswrapper[4735]: I0317 03:18:12.606286 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 03:18:12 crc kubenswrapper[4735]: I0317 03:18:12.607018 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.606625 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.607273 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.607335 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m"
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.608460 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d312acc7754de86a75654f421960d6f383d57e7b4013f36754cd39ec70c0a44a"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.608544 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://d312acc7754de86a75654f421960d6f383d57e7b4013f36754cd39ec70c0a44a" gracePeriod=600
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.776203 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="d312acc7754de86a75654f421960d6f383d57e7b4013f36754cd39ec70c0a44a" exitCode=0
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.776265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"d312acc7754de86a75654f421960d6f383d57e7b4013f36754cd39ec70c0a44a"}
Mar 17 03:18:42 crc kubenswrapper[4735]: I0317 03:18:42.776530 4735 scope.go:117] "RemoveContainer" containerID="1ff044501b96734984b6cbd1554fdc9ff33fada6d7edf31476f6b19847756c40"
Mar 17 03:18:43 crc kubenswrapper[4735]: I0317 03:18:43.795902 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1"}
Mar 17 03:19:06 crc kubenswrapper[4735]: I0317 03:19:06.266635 4735 scope.go:117] "RemoveContainer" containerID="443841550cd14b8b5c2d046a2c8a05af09bbbf36f454a6b30189a4cd6f54f742"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.150677 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561960-rsfrs"]
Mar 17 03:20:00 crc kubenswrapper[4735]: E0317 03:20:00.151663 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33eb0484-f254-44cc-862e-9b274c0fabbc" containerName="oc"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.151681 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33eb0484-f254-44cc-862e-9b274c0fabbc" containerName="oc"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.151934 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="33eb0484-f254-44cc-862e-9b274c0fabbc" containerName="oc"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.152650 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561960-rsfrs"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.158382 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.158511 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.158794 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.167624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561960-rsfrs"]
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.273365 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kj4v\" (UniqueName: \"kubernetes.io/projected/13f2194c-eab4-4989-8709-9c6f8e2c99f4-kube-api-access-6kj4v\") pod \"auto-csr-approver-29561960-rsfrs\" (UID: \"13f2194c-eab4-4989-8709-9c6f8e2c99f4\") " pod="openshift-infra/auto-csr-approver-29561960-rsfrs"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.375332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kj4v\" (UniqueName: \"kubernetes.io/projected/13f2194c-eab4-4989-8709-9c6f8e2c99f4-kube-api-access-6kj4v\") pod \"auto-csr-approver-29561960-rsfrs\" (UID: \"13f2194c-eab4-4989-8709-9c6f8e2c99f4\") " pod="openshift-infra/auto-csr-approver-29561960-rsfrs"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.398316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kj4v\" (UniqueName: \"kubernetes.io/projected/13f2194c-eab4-4989-8709-9c6f8e2c99f4-kube-api-access-6kj4v\") pod \"auto-csr-approver-29561960-rsfrs\" (UID: \"13f2194c-eab4-4989-8709-9c6f8e2c99f4\") " pod="openshift-infra/auto-csr-approver-29561960-rsfrs"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.477338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561960-rsfrs"
Mar 17 03:20:00 crc kubenswrapper[4735]: I0317 03:20:00.976276 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561960-rsfrs"]
Mar 17 03:20:01 crc kubenswrapper[4735]: I0317 03:20:01.602861 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561960-rsfrs" event={"ID":"13f2194c-eab4-4989-8709-9c6f8e2c99f4","Type":"ContainerStarted","Data":"0fd7f96ac1696e46b20a070c02683b3ce16647d0d05b55032796de36404f3e40"}
Mar 17 03:20:03 crc kubenswrapper[4735]: I0317 03:20:03.626813 4735 generic.go:334] "Generic (PLEG): container finished" podID="13f2194c-eab4-4989-8709-9c6f8e2c99f4" containerID="d5b0a92de13772b86e234a1781179f6b0065b40950798a6df33ec151d58c4af7" exitCode=0
Mar 17 03:20:03 crc kubenswrapper[4735]: I0317 03:20:03.626908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561960-rsfrs" event={"ID":"13f2194c-eab4-4989-8709-9c6f8e2c99f4","Type":"ContainerDied","Data":"d5b0a92de13772b86e234a1781179f6b0065b40950798a6df33ec151d58c4af7"}
Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.095379 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561960-rsfrs" Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.170874 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kj4v\" (UniqueName: \"kubernetes.io/projected/13f2194c-eab4-4989-8709-9c6f8e2c99f4-kube-api-access-6kj4v\") pod \"13f2194c-eab4-4989-8709-9c6f8e2c99f4\" (UID: \"13f2194c-eab4-4989-8709-9c6f8e2c99f4\") " Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.179315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f2194c-eab4-4989-8709-9c6f8e2c99f4-kube-api-access-6kj4v" (OuterVolumeSpecName: "kube-api-access-6kj4v") pod "13f2194c-eab4-4989-8709-9c6f8e2c99f4" (UID: "13f2194c-eab4-4989-8709-9c6f8e2c99f4"). InnerVolumeSpecName "kube-api-access-6kj4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.276460 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kj4v\" (UniqueName: \"kubernetes.io/projected/13f2194c-eab4-4989-8709-9c6f8e2c99f4-kube-api-access-6kj4v\") on node \"crc\" DevicePath \"\"" Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.649994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561960-rsfrs" event={"ID":"13f2194c-eab4-4989-8709-9c6f8e2c99f4","Type":"ContainerDied","Data":"0fd7f96ac1696e46b20a070c02683b3ce16647d0d05b55032796de36404f3e40"} Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.650045 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fd7f96ac1696e46b20a070c02683b3ce16647d0d05b55032796de36404f3e40" Mar 17 03:20:05 crc kubenswrapper[4735]: I0317 03:20:05.650068 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561960-rsfrs" Mar 17 03:20:06 crc kubenswrapper[4735]: I0317 03:20:06.179928 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561954-w8ddm"] Mar 17 03:20:06 crc kubenswrapper[4735]: I0317 03:20:06.192640 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561954-w8ddm"] Mar 17 03:20:07 crc kubenswrapper[4735]: I0317 03:20:07.085345 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f58bf8d-23a6-4437-b77d-c5859fd49afc" path="/var/lib/kubelet/pods/0f58bf8d-23a6-4437-b77d-c5859fd49afc/volumes" Mar 17 03:20:42 crc kubenswrapper[4735]: I0317 03:20:42.607178 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:20:42 crc kubenswrapper[4735]: I0317 03:20:42.607618 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.061134 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xx6rt"] Mar 17 03:21:06 crc kubenswrapper[4735]: E0317 03:21:06.062056 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f2194c-eab4-4989-8709-9c6f8e2c99f4" containerName="oc" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.062068 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f2194c-eab4-4989-8709-9c6f8e2c99f4" containerName="oc" Mar 17 03:21:06 crc 
kubenswrapper[4735]: I0317 03:21:06.062276 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f2194c-eab4-4989-8709-9c6f8e2c99f4" containerName="oc" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.063582 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.085292 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xx6rt"] Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.093137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8rh\" (UniqueName: \"kubernetes.io/projected/e256a79c-26ed-4b57-a0e8-91d1154abc06-kube-api-access-dq8rh\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.093361 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-utilities\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.093387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-catalog-content\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.195843 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-utilities\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.195933 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-catalog-content\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.196048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8rh\" (UniqueName: \"kubernetes.io/projected/e256a79c-26ed-4b57-a0e8-91d1154abc06-kube-api-access-dq8rh\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.196461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-catalog-content\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.197083 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-utilities\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.238652 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8rh\" (UniqueName: 
\"kubernetes.io/projected/e256a79c-26ed-4b57-a0e8-91d1154abc06-kube-api-access-dq8rh\") pod \"redhat-operators-xx6rt\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.381228 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.387650 4735 scope.go:117] "RemoveContainer" containerID="333fab1a50203b7e143c8a66addf28d9292ecd1883c32c6263acf85dfa43f7c4" Mar 17 03:21:06 crc kubenswrapper[4735]: I0317 03:21:06.922338 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xx6rt"] Mar 17 03:21:07 crc kubenswrapper[4735]: I0317 03:21:07.255273 4735 generic.go:334] "Generic (PLEG): container finished" podID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerID="c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718" exitCode=0 Mar 17 03:21:07 crc kubenswrapper[4735]: I0317 03:21:07.256612 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerDied","Data":"c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718"} Mar 17 03:21:07 crc kubenswrapper[4735]: I0317 03:21:07.256748 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerStarted","Data":"8d64afd9a4c861847f6d1a36aec3fca383fca3faefc356b853006e01f50f5875"} Mar 17 03:21:07 crc kubenswrapper[4735]: I0317 03:21:07.258988 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:21:08 crc kubenswrapper[4735]: I0317 03:21:08.267046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" 
event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerStarted","Data":"e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4"} Mar 17 03:21:12 crc kubenswrapper[4735]: I0317 03:21:12.606635 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:21:12 crc kubenswrapper[4735]: I0317 03:21:12.607206 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:21:13 crc kubenswrapper[4735]: I0317 03:21:13.314177 4735 generic.go:334] "Generic (PLEG): container finished" podID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerID="e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4" exitCode=0 Mar 17 03:21:13 crc kubenswrapper[4735]: I0317 03:21:13.314262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerDied","Data":"e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4"} Mar 17 03:21:14 crc kubenswrapper[4735]: I0317 03:21:14.325688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerStarted","Data":"0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48"} Mar 17 03:21:14 crc kubenswrapper[4735]: I0317 03:21:14.350405 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xx6rt" 
podStartSLOduration=1.789368531 podStartE2EDuration="8.350383311s" podCreationTimestamp="2026-03-17 03:21:06 +0000 UTC" firstStartedPulling="2026-03-17 03:21:07.258719083 +0000 UTC m=+7892.890952081" lastFinishedPulling="2026-03-17 03:21:13.819733883 +0000 UTC m=+7899.451966861" observedRunningTime="2026-03-17 03:21:14.343240059 +0000 UTC m=+7899.975473037" watchObservedRunningTime="2026-03-17 03:21:14.350383311 +0000 UTC m=+7899.982616289" Mar 17 03:21:16 crc kubenswrapper[4735]: I0317 03:21:16.381717 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:16 crc kubenswrapper[4735]: I0317 03:21:16.382079 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:17 crc kubenswrapper[4735]: I0317 03:21:17.428653 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xx6rt" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" probeResult="failure" output=< Mar 17 03:21:17 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:21:17 crc kubenswrapper[4735]: > Mar 17 03:21:27 crc kubenswrapper[4735]: I0317 03:21:27.446845 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xx6rt" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" probeResult="failure" output=< Mar 17 03:21:27 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:21:27 crc kubenswrapper[4735]: > Mar 17 03:21:37 crc kubenswrapper[4735]: I0317 03:21:37.438837 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xx6rt" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" probeResult="failure" output=< Mar 17 03:21:37 crc kubenswrapper[4735]: timeout: failed to connect 
service ":50051" within 1s Mar 17 03:21:37 crc kubenswrapper[4735]: > Mar 17 03:21:42 crc kubenswrapper[4735]: I0317 03:21:42.606700 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:21:42 crc kubenswrapper[4735]: I0317 03:21:42.607245 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:21:42 crc kubenswrapper[4735]: I0317 03:21:42.607286 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:21:42 crc kubenswrapper[4735]: I0317 03:21:42.608904 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:21:42 crc kubenswrapper[4735]: I0317 03:21:42.608981 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" gracePeriod=600 Mar 17 03:21:42 crc kubenswrapper[4735]: E0317 03:21:42.742596 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:21:43 crc kubenswrapper[4735]: I0317 03:21:43.613845 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" exitCode=0 Mar 17 03:21:43 crc kubenswrapper[4735]: I0317 03:21:43.613892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1"} Mar 17 03:21:43 crc kubenswrapper[4735]: I0317 03:21:43.613961 4735 scope.go:117] "RemoveContainer" containerID="d312acc7754de86a75654f421960d6f383d57e7b4013f36754cd39ec70c0a44a" Mar 17 03:21:43 crc kubenswrapper[4735]: I0317 03:21:43.614885 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:21:43 crc kubenswrapper[4735]: E0317 03:21:43.615214 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:21:46 crc kubenswrapper[4735]: I0317 03:21:46.464581 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 
03:21:46 crc kubenswrapper[4735]: I0317 03:21:46.527294 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:46 crc kubenswrapper[4735]: I0317 03:21:46.711151 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xx6rt"] Mar 17 03:21:47 crc kubenswrapper[4735]: I0317 03:21:47.686207 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xx6rt" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" containerID="cri-o://0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48" gracePeriod=2 Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.684388 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.739543 4735 generic.go:334] "Generic (PLEG): container finished" podID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerID="0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48" exitCode=0 Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.739585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerDied","Data":"0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48"} Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.739610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xx6rt" event={"ID":"e256a79c-26ed-4b57-a0e8-91d1154abc06","Type":"ContainerDied","Data":"8d64afd9a4c861847f6d1a36aec3fca383fca3faefc356b853006e01f50f5875"} Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.739626 4735 scope.go:117] "RemoveContainer" containerID="0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48" Mar 17 
03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.739779 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xx6rt" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.764934 4735 scope.go:117] "RemoveContainer" containerID="e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.789144 4735 scope.go:117] "RemoveContainer" containerID="c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.831069 4735 scope.go:117] "RemoveContainer" containerID="0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48" Mar 17 03:21:48 crc kubenswrapper[4735]: E0317 03:21:48.841954 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48\": container with ID starting with 0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48 not found: ID does not exist" containerID="0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.842010 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48"} err="failed to get container status \"0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48\": rpc error: code = NotFound desc = could not find container \"0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48\": container with ID starting with 0040381914de244f6368147acfd8136f3c5ef0cb12c9950696e7eca5b2670d48 not found: ID does not exist" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.842036 4735 scope.go:117] "RemoveContainer" containerID="e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4" Mar 17 03:21:48 crc 
kubenswrapper[4735]: I0317 03:21:48.850448 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8rh\" (UniqueName: \"kubernetes.io/projected/e256a79c-26ed-4b57-a0e8-91d1154abc06-kube-api-access-dq8rh\") pod \"e256a79c-26ed-4b57-a0e8-91d1154abc06\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.850529 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-utilities\") pod \"e256a79c-26ed-4b57-a0e8-91d1154abc06\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.850643 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-catalog-content\") pod \"e256a79c-26ed-4b57-a0e8-91d1154abc06\" (UID: \"e256a79c-26ed-4b57-a0e8-91d1154abc06\") " Mar 17 03:21:48 crc kubenswrapper[4735]: E0317 03:21:48.850773 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4\": container with ID starting with e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4 not found: ID does not exist" containerID="e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.850809 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4"} err="failed to get container status \"e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4\": rpc error: code = NotFound desc = could not find container \"e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4\": container with ID 
starting with e180b2b1f8ee751c692d479eb783f45090846ba477733b969938187ef90ebcc4 not found: ID does not exist" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.850831 4735 scope.go:117] "RemoveContainer" containerID="c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718" Mar 17 03:21:48 crc kubenswrapper[4735]: E0317 03:21:48.853031 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718\": container with ID starting with c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718 not found: ID does not exist" containerID="c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.853067 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718"} err="failed to get container status \"c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718\": rpc error: code = NotFound desc = could not find container \"c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718\": container with ID starting with c89b561b152112680907d5809a117cd267852d6f0e522434749c96807619a718 not found: ID does not exist" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.853373 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-utilities" (OuterVolumeSpecName: "utilities") pod "e256a79c-26ed-4b57-a0e8-91d1154abc06" (UID: "e256a79c-26ed-4b57-a0e8-91d1154abc06"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.867850 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e256a79c-26ed-4b57-a0e8-91d1154abc06-kube-api-access-dq8rh" (OuterVolumeSpecName: "kube-api-access-dq8rh") pod "e256a79c-26ed-4b57-a0e8-91d1154abc06" (UID: "e256a79c-26ed-4b57-a0e8-91d1154abc06"). InnerVolumeSpecName "kube-api-access-dq8rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.953812 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8rh\" (UniqueName: \"kubernetes.io/projected/e256a79c-26ed-4b57-a0e8-91d1154abc06-kube-api-access-dq8rh\") on node \"crc\" DevicePath \"\"" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.953844 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:21:48 crc kubenswrapper[4735]: I0317 03:21:48.961241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e256a79c-26ed-4b57-a0e8-91d1154abc06" (UID: "e256a79c-26ed-4b57-a0e8-91d1154abc06"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:21:49 crc kubenswrapper[4735]: I0317 03:21:49.055391 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e256a79c-26ed-4b57-a0e8-91d1154abc06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:21:49 crc kubenswrapper[4735]: I0317 03:21:49.068747 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xx6rt"] Mar 17 03:21:49 crc kubenswrapper[4735]: I0317 03:21:49.088251 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xx6rt"] Mar 17 03:21:51 crc kubenswrapper[4735]: I0317 03:21:51.087384 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" path="/var/lib/kubelet/pods/e256a79c-26ed-4b57-a0e8-91d1154abc06/volumes" Mar 17 03:21:55 crc kubenswrapper[4735]: I0317 03:21:55.084676 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:21:55 crc kubenswrapper[4735]: E0317 03:21:55.085730 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.154250 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561962-nzj7s"] Mar 17 03:22:00 crc kubenswrapper[4735]: E0317 03:22:00.155151 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="extract-utilities" Mar 17 03:22:00 crc 
kubenswrapper[4735]: I0317 03:22:00.155171 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="extract-utilities" Mar 17 03:22:00 crc kubenswrapper[4735]: E0317 03:22:00.155191 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.155197 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" Mar 17 03:22:00 crc kubenswrapper[4735]: E0317 03:22:00.155207 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="extract-content" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.155213 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="extract-content" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.155414 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e256a79c-26ed-4b57-a0e8-91d1154abc06" containerName="registry-server" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.156043 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.160779 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.161003 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.163385 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561962-nzj7s"] Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.165570 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.193579 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k6zb\" (UniqueName: \"kubernetes.io/projected/eb5d427f-5745-45ff-9e82-75a34b12da44-kube-api-access-7k6zb\") pod \"auto-csr-approver-29561962-nzj7s\" (UID: \"eb5d427f-5745-45ff-9e82-75a34b12da44\") " pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.294525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k6zb\" (UniqueName: \"kubernetes.io/projected/eb5d427f-5745-45ff-9e82-75a34b12da44-kube-api-access-7k6zb\") pod \"auto-csr-approver-29561962-nzj7s\" (UID: \"eb5d427f-5745-45ff-9e82-75a34b12da44\") " pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.315533 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k6zb\" (UniqueName: \"kubernetes.io/projected/eb5d427f-5745-45ff-9e82-75a34b12da44-kube-api-access-7k6zb\") pod \"auto-csr-approver-29561962-nzj7s\" (UID: \"eb5d427f-5745-45ff-9e82-75a34b12da44\") " 
pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:00 crc kubenswrapper[4735]: I0317 03:22:00.477682 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:01 crc kubenswrapper[4735]: I0317 03:22:01.053501 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561962-nzj7s"] Mar 17 03:22:01 crc kubenswrapper[4735]: W0317 03:22:01.058019 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb5d427f_5745_45ff_9e82_75a34b12da44.slice/crio-6d39041d3525094fb51bf8c753513f5015c5641ddb5752eebc9e9d0461276b05 WatchSource:0}: Error finding container 6d39041d3525094fb51bf8c753513f5015c5641ddb5752eebc9e9d0461276b05: Status 404 returned error can't find the container with id 6d39041d3525094fb51bf8c753513f5015c5641ddb5752eebc9e9d0461276b05 Mar 17 03:22:01 crc kubenswrapper[4735]: I0317 03:22:01.910036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" event={"ID":"eb5d427f-5745-45ff-9e82-75a34b12da44","Type":"ContainerStarted","Data":"6d39041d3525094fb51bf8c753513f5015c5641ddb5752eebc9e9d0461276b05"} Mar 17 03:22:02 crc kubenswrapper[4735]: I0317 03:22:02.919625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" event={"ID":"eb5d427f-5745-45ff-9e82-75a34b12da44","Type":"ContainerStarted","Data":"827682ca782a56404b054affbc8c4de0958f1584861c9c089d744a8c780f57cd"} Mar 17 03:22:02 crc kubenswrapper[4735]: I0317 03:22:02.943491 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" podStartSLOduration=1.8960653760000001 podStartE2EDuration="2.943466349s" podCreationTimestamp="2026-03-17 03:22:00 +0000 UTC" firstStartedPulling="2026-03-17 03:22:01.061145526 +0000 UTC 
m=+7946.693378504" lastFinishedPulling="2026-03-17 03:22:02.108546469 +0000 UTC m=+7947.740779477" observedRunningTime="2026-03-17 03:22:02.935250501 +0000 UTC m=+7948.567483479" watchObservedRunningTime="2026-03-17 03:22:02.943466349 +0000 UTC m=+7948.575699357" Mar 17 03:22:03 crc kubenswrapper[4735]: I0317 03:22:03.931340 4735 generic.go:334] "Generic (PLEG): container finished" podID="eb5d427f-5745-45ff-9e82-75a34b12da44" containerID="827682ca782a56404b054affbc8c4de0958f1584861c9c089d744a8c780f57cd" exitCode=0 Mar 17 03:22:03 crc kubenswrapper[4735]: I0317 03:22:03.931462 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" event={"ID":"eb5d427f-5745-45ff-9e82-75a34b12da44","Type":"ContainerDied","Data":"827682ca782a56404b054affbc8c4de0958f1584861c9c089d744a8c780f57cd"} Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.325791 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.398852 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k6zb\" (UniqueName: \"kubernetes.io/projected/eb5d427f-5745-45ff-9e82-75a34b12da44-kube-api-access-7k6zb\") pod \"eb5d427f-5745-45ff-9e82-75a34b12da44\" (UID: \"eb5d427f-5745-45ff-9e82-75a34b12da44\") " Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.405646 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5d427f-5745-45ff-9e82-75a34b12da44-kube-api-access-7k6zb" (OuterVolumeSpecName: "kube-api-access-7k6zb") pod "eb5d427f-5745-45ff-9e82-75a34b12da44" (UID: "eb5d427f-5745-45ff-9e82-75a34b12da44"). InnerVolumeSpecName "kube-api-access-7k6zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.503149 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k6zb\" (UniqueName: \"kubernetes.io/projected/eb5d427f-5745-45ff-9e82-75a34b12da44-kube-api-access-7k6zb\") on node \"crc\" DevicePath \"\"" Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.958110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" event={"ID":"eb5d427f-5745-45ff-9e82-75a34b12da44","Type":"ContainerDied","Data":"6d39041d3525094fb51bf8c753513f5015c5641ddb5752eebc9e9d0461276b05"} Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.958159 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561962-nzj7s" Mar 17 03:22:05 crc kubenswrapper[4735]: I0317 03:22:05.958168 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d39041d3525094fb51bf8c753513f5015c5641ddb5752eebc9e9d0461276b05" Mar 17 03:22:06 crc kubenswrapper[4735]: I0317 03:22:06.033189 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561956-kl52k"] Mar 17 03:22:06 crc kubenswrapper[4735]: I0317 03:22:06.044277 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561956-kl52k"] Mar 17 03:22:06 crc kubenswrapper[4735]: I0317 03:22:06.075207 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:22:06 crc kubenswrapper[4735]: E0317 03:22:06.075459 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:22:06 crc kubenswrapper[4735]: I0317 03:22:06.466464 4735 scope.go:117] "RemoveContainer" containerID="823f8951a9235e8157a306ae9aef31af292117516bda2e9f620bdcf2386a20a2" Mar 17 03:22:07 crc kubenswrapper[4735]: I0317 03:22:07.093202 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e806fb2d-451a-45dc-9a4d-06798f4d7f6c" path="/var/lib/kubelet/pods/e806fb2d-451a-45dc-9a4d-06798f4d7f6c/volumes" Mar 17 03:22:17 crc kubenswrapper[4735]: I0317 03:22:17.074223 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:22:17 crc kubenswrapper[4735]: E0317 03:22:17.075256 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:22:31 crc kubenswrapper[4735]: I0317 03:22:31.073445 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:22:31 crc kubenswrapper[4735]: E0317 03:22:31.076006 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:22:42 crc kubenswrapper[4735]: I0317 03:22:42.073700 4735 scope.go:117] "RemoveContainer" 
containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:22:42 crc kubenswrapper[4735]: E0317 03:22:42.076576 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:22:53 crc kubenswrapper[4735]: I0317 03:22:53.079570 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:22:53 crc kubenswrapper[4735]: E0317 03:22:53.080564 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:23:06 crc kubenswrapper[4735]: I0317 03:23:06.555997 4735 scope.go:117] "RemoveContainer" containerID="99c9c5e57eabbc22fb4687621a2b5105e2a795e1147f9c9720fe393f2dfd71fe" Mar 17 03:23:06 crc kubenswrapper[4735]: I0317 03:23:06.615309 4735 scope.go:117] "RemoveContainer" containerID="2a958e631d83fecfebd5e9e5d3aa2be460561a6584e9b4ae561b5d6c76f8b838" Mar 17 03:23:06 crc kubenswrapper[4735]: I0317 03:23:06.696631 4735 scope.go:117] "RemoveContainer" containerID="d6e117b77af9517b5d14d38ef1b90b88a0d1685faab02dbe743107a18a454101" Mar 17 03:23:07 crc kubenswrapper[4735]: I0317 03:23:07.073452 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:23:07 crc 
kubenswrapper[4735]: E0317 03:23:07.073717 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.852493 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5zzp"] Mar 17 03:23:14 crc kubenswrapper[4735]: E0317 03:23:14.853267 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5d427f-5745-45ff-9e82-75a34b12da44" containerName="oc" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.853279 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5d427f-5745-45ff-9e82-75a34b12da44" containerName="oc" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.853457 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5d427f-5745-45ff-9e82-75a34b12da44" containerName="oc" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.854698 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.891888 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5zzp"] Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.968663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbq9\" (UniqueName: \"kubernetes.io/projected/57dd5103-f711-4120-a94e-5ce3b9b56f1e-kube-api-access-nlbq9\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.968760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-utilities\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:14 crc kubenswrapper[4735]: I0317 03:23:14.968953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-catalog-content\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.070626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbq9\" (UniqueName: \"kubernetes.io/projected/57dd5103-f711-4120-a94e-5ce3b9b56f1e-kube-api-access-nlbq9\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.070769 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-utilities\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.070834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-catalog-content\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.071465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-utilities\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.071495 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-catalog-content\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.108595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlbq9\" (UniqueName: \"kubernetes.io/projected/57dd5103-f711-4120-a94e-5ce3b9b56f1e-kube-api-access-nlbq9\") pod \"community-operators-h5zzp\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:15 crc kubenswrapper[4735]: I0317 03:23:15.187289 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:16 crc kubenswrapper[4735]: I0317 03:23:16.497009 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5zzp"] Mar 17 03:23:16 crc kubenswrapper[4735]: I0317 03:23:16.659489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerStarted","Data":"4d24d46de8f8ee3152239adff616e0dc25edb361444e827c9b8439322e25b369"} Mar 17 03:23:17 crc kubenswrapper[4735]: I0317 03:23:17.672260 4735 generic.go:334] "Generic (PLEG): container finished" podID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerID="33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da" exitCode=0 Mar 17 03:23:17 crc kubenswrapper[4735]: I0317 03:23:17.672383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerDied","Data":"33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da"} Mar 17 03:23:18 crc kubenswrapper[4735]: I0317 03:23:18.073376 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:23:18 crc kubenswrapper[4735]: E0317 03:23:18.074191 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:23:18 crc kubenswrapper[4735]: I0317 03:23:18.726177 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" 
event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerStarted","Data":"754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188"} Mar 17 03:23:20 crc kubenswrapper[4735]: I0317 03:23:20.757426 4735 generic.go:334] "Generic (PLEG): container finished" podID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerID="754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188" exitCode=0 Mar 17 03:23:20 crc kubenswrapper[4735]: I0317 03:23:20.757967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerDied","Data":"754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188"} Mar 17 03:23:21 crc kubenswrapper[4735]: I0317 03:23:21.775622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerStarted","Data":"910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0"} Mar 17 03:23:21 crc kubenswrapper[4735]: I0317 03:23:21.803779 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5zzp" podStartSLOduration=4.296168121 podStartE2EDuration="7.803750219s" podCreationTimestamp="2026-03-17 03:23:14 +0000 UTC" firstStartedPulling="2026-03-17 03:23:17.674736238 +0000 UTC m=+8023.306969216" lastFinishedPulling="2026-03-17 03:23:21.182318336 +0000 UTC m=+8026.814551314" observedRunningTime="2026-03-17 03:23:21.800897161 +0000 UTC m=+8027.433130169" watchObservedRunningTime="2026-03-17 03:23:21.803750219 +0000 UTC m=+8027.435983227" Mar 17 03:23:25 crc kubenswrapper[4735]: I0317 03:23:25.190252 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:25 crc kubenswrapper[4735]: I0317 03:23:25.190711 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:26 crc kubenswrapper[4735]: I0317 03:23:26.242841 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-h5zzp" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="registry-server" probeResult="failure" output=< Mar 17 03:23:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:23:26 crc kubenswrapper[4735]: > Mar 17 03:23:31 crc kubenswrapper[4735]: I0317 03:23:31.074276 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:23:31 crc kubenswrapper[4735]: E0317 03:23:31.075134 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:23:35 crc kubenswrapper[4735]: I0317 03:23:35.245628 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:35 crc kubenswrapper[4735]: I0317 03:23:35.316034 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:35 crc kubenswrapper[4735]: I0317 03:23:35.492973 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5zzp"] Mar 17 03:23:36 crc kubenswrapper[4735]: I0317 03:23:36.937086 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5zzp" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="registry-server" 
containerID="cri-o://910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0" gracePeriod=2 Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.476833 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.584241 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-catalog-content\") pod \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.584613 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-utilities\") pod \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.584825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlbq9\" (UniqueName: \"kubernetes.io/projected/57dd5103-f711-4120-a94e-5ce3b9b56f1e-kube-api-access-nlbq9\") pod \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\" (UID: \"57dd5103-f711-4120-a94e-5ce3b9b56f1e\") " Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.585413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-utilities" (OuterVolumeSpecName: "utilities") pod "57dd5103-f711-4120-a94e-5ce3b9b56f1e" (UID: "57dd5103-f711-4120-a94e-5ce3b9b56f1e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.585942 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.598219 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dd5103-f711-4120-a94e-5ce3b9b56f1e-kube-api-access-nlbq9" (OuterVolumeSpecName: "kube-api-access-nlbq9") pod "57dd5103-f711-4120-a94e-5ce3b9b56f1e" (UID: "57dd5103-f711-4120-a94e-5ce3b9b56f1e"). InnerVolumeSpecName "kube-api-access-nlbq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.660316 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57dd5103-f711-4120-a94e-5ce3b9b56f1e" (UID: "57dd5103-f711-4120-a94e-5ce3b9b56f1e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.689074 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57dd5103-f711-4120-a94e-5ce3b9b56f1e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.690014 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlbq9\" (UniqueName: \"kubernetes.io/projected/57dd5103-f711-4120-a94e-5ce3b9b56f1e-kube-api-access-nlbq9\") on node \"crc\" DevicePath \"\"" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.951152 4735 generic.go:334] "Generic (PLEG): container finished" podID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerID="910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0" exitCode=0 Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.951201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerDied","Data":"910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0"} Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.951234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5zzp" event={"ID":"57dd5103-f711-4120-a94e-5ce3b9b56f1e","Type":"ContainerDied","Data":"4d24d46de8f8ee3152239adff616e0dc25edb361444e827c9b8439322e25b369"} Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.951251 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5zzp" Mar 17 03:23:37 crc kubenswrapper[4735]: I0317 03:23:37.951274 4735 scope.go:117] "RemoveContainer" containerID="910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.003984 4735 scope.go:117] "RemoveContainer" containerID="754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.032355 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5zzp"] Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.045183 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5zzp"] Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.045851 4735 scope.go:117] "RemoveContainer" containerID="33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.099633 4735 scope.go:117] "RemoveContainer" containerID="910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0" Mar 17 03:23:38 crc kubenswrapper[4735]: E0317 03:23:38.100238 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0\": container with ID starting with 910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0 not found: ID does not exist" containerID="910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.100272 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0"} err="failed to get container status \"910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0\": rpc error: code = NotFound desc = could not find 
container \"910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0\": container with ID starting with 910d962b04909ac9a15790461f16bf5612f826351f194c0e71841c61204c99e0 not found: ID does not exist" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.100298 4735 scope.go:117] "RemoveContainer" containerID="754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188" Mar 17 03:23:38 crc kubenswrapper[4735]: E0317 03:23:38.102037 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188\": container with ID starting with 754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188 not found: ID does not exist" containerID="754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.102108 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188"} err="failed to get container status \"754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188\": rpc error: code = NotFound desc = could not find container \"754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188\": container with ID starting with 754c1bcf1f2e535b25af31a0c2da50bcc8f012093ebb03fa10be4dd0bcf99188 not found: ID does not exist" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.102152 4735 scope.go:117] "RemoveContainer" containerID="33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da" Mar 17 03:23:38 crc kubenswrapper[4735]: E0317 03:23:38.102794 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da\": container with ID starting with 33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da not found: ID does 
not exist" containerID="33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da" Mar 17 03:23:38 crc kubenswrapper[4735]: I0317 03:23:38.102828 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da"} err="failed to get container status \"33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da\": rpc error: code = NotFound desc = could not find container \"33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da\": container with ID starting with 33a90ec66d61a57e0dd0fa51cad451b873984448d95291bac76eff61643210da not found: ID does not exist" Mar 17 03:23:39 crc kubenswrapper[4735]: I0317 03:23:39.094634 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" path="/var/lib/kubelet/pods/57dd5103-f711-4120-a94e-5ce3b9b56f1e/volumes" Mar 17 03:23:46 crc kubenswrapper[4735]: I0317 03:23:46.074241 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:23:46 crc kubenswrapper[4735]: E0317 03:23:46.075204 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.074231 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:24:00 crc kubenswrapper[4735]: E0317 03:24:00.075250 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.166797 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561964-kmhgl"] Mar 17 03:24:00 crc kubenswrapper[4735]: E0317 03:24:00.167251 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="extract-content" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.167267 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="extract-content" Mar 17 03:24:00 crc kubenswrapper[4735]: E0317 03:24:00.167289 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="extract-utilities" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.167296 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="extract-utilities" Mar 17 03:24:00 crc kubenswrapper[4735]: E0317 03:24:00.167313 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="registry-server" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.167319 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="registry-server" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.167511 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="57dd5103-f711-4120-a94e-5ce3b9b56f1e" containerName="registry-server" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.168178 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.171370 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.174483 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.175438 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.178881 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561964-kmhgl"] Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.306124 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlnhz\" (UniqueName: \"kubernetes.io/projected/68eaa4d9-9f62-40f1-8718-1a046df3cc4c-kube-api-access-dlnhz\") pod \"auto-csr-approver-29561964-kmhgl\" (UID: \"68eaa4d9-9f62-40f1-8718-1a046df3cc4c\") " pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.408793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlnhz\" (UniqueName: \"kubernetes.io/projected/68eaa4d9-9f62-40f1-8718-1a046df3cc4c-kube-api-access-dlnhz\") pod \"auto-csr-approver-29561964-kmhgl\" (UID: \"68eaa4d9-9f62-40f1-8718-1a046df3cc4c\") " pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.432001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlnhz\" (UniqueName: \"kubernetes.io/projected/68eaa4d9-9f62-40f1-8718-1a046df3cc4c-kube-api-access-dlnhz\") pod \"auto-csr-approver-29561964-kmhgl\" (UID: \"68eaa4d9-9f62-40f1-8718-1a046df3cc4c\") " 
pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.497158 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:00 crc kubenswrapper[4735]: W0317 03:24:00.989976 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68eaa4d9_9f62_40f1_8718_1a046df3cc4c.slice/crio-be1f80e1908863d18264556ec0e3a85b8714ccf3f25c40c4a905342782a736f9 WatchSource:0}: Error finding container be1f80e1908863d18264556ec0e3a85b8714ccf3f25c40c4a905342782a736f9: Status 404 returned error can't find the container with id be1f80e1908863d18264556ec0e3a85b8714ccf3f25c40c4a905342782a736f9 Mar 17 03:24:00 crc kubenswrapper[4735]: I0317 03:24:00.991430 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561964-kmhgl"] Mar 17 03:24:01 crc kubenswrapper[4735]: I0317 03:24:01.209790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" event={"ID":"68eaa4d9-9f62-40f1-8718-1a046df3cc4c","Type":"ContainerStarted","Data":"be1f80e1908863d18264556ec0e3a85b8714ccf3f25c40c4a905342782a736f9"} Mar 17 03:24:03 crc kubenswrapper[4735]: I0317 03:24:03.230240 4735 generic.go:334] "Generic (PLEG): container finished" podID="68eaa4d9-9f62-40f1-8718-1a046df3cc4c" containerID="88b3e4d0d2daee786a639d7bbee7fbeda066e8a45c00538ec664dcb8b839fd48" exitCode=0 Mar 17 03:24:03 crc kubenswrapper[4735]: I0317 03:24:03.230303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" event={"ID":"68eaa4d9-9f62-40f1-8718-1a046df3cc4c","Type":"ContainerDied","Data":"88b3e4d0d2daee786a639d7bbee7fbeda066e8a45c00538ec664dcb8b839fd48"} Mar 17 03:24:04 crc kubenswrapper[4735]: I0317 03:24:04.569150 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:04 crc kubenswrapper[4735]: I0317 03:24:04.695268 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlnhz\" (UniqueName: \"kubernetes.io/projected/68eaa4d9-9f62-40f1-8718-1a046df3cc4c-kube-api-access-dlnhz\") pod \"68eaa4d9-9f62-40f1-8718-1a046df3cc4c\" (UID: \"68eaa4d9-9f62-40f1-8718-1a046df3cc4c\") " Mar 17 03:24:04 crc kubenswrapper[4735]: I0317 03:24:04.700956 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68eaa4d9-9f62-40f1-8718-1a046df3cc4c-kube-api-access-dlnhz" (OuterVolumeSpecName: "kube-api-access-dlnhz") pod "68eaa4d9-9f62-40f1-8718-1a046df3cc4c" (UID: "68eaa4d9-9f62-40f1-8718-1a046df3cc4c"). InnerVolumeSpecName "kube-api-access-dlnhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:24:04 crc kubenswrapper[4735]: I0317 03:24:04.797461 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlnhz\" (UniqueName: \"kubernetes.io/projected/68eaa4d9-9f62-40f1-8718-1a046df3cc4c-kube-api-access-dlnhz\") on node \"crc\" DevicePath \"\"" Mar 17 03:24:05 crc kubenswrapper[4735]: I0317 03:24:05.247142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" event={"ID":"68eaa4d9-9f62-40f1-8718-1a046df3cc4c","Type":"ContainerDied","Data":"be1f80e1908863d18264556ec0e3a85b8714ccf3f25c40c4a905342782a736f9"} Mar 17 03:24:05 crc kubenswrapper[4735]: I0317 03:24:05.247186 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be1f80e1908863d18264556ec0e3a85b8714ccf3f25c40c4a905342782a736f9" Mar 17 03:24:05 crc kubenswrapper[4735]: I0317 03:24:05.247222 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561964-kmhgl" Mar 17 03:24:05 crc kubenswrapper[4735]: I0317 03:24:05.650115 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561958-tpr5b"] Mar 17 03:24:05 crc kubenswrapper[4735]: I0317 03:24:05.656974 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561958-tpr5b"] Mar 17 03:24:07 crc kubenswrapper[4735]: I0317 03:24:07.087362 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33eb0484-f254-44cc-862e-9b274c0fabbc" path="/var/lib/kubelet/pods/33eb0484-f254-44cc-862e-9b274c0fabbc/volumes" Mar 17 03:24:12 crc kubenswrapper[4735]: I0317 03:24:12.073457 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:24:12 crc kubenswrapper[4735]: E0317 03:24:12.074579 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:24:25 crc kubenswrapper[4735]: I0317 03:24:25.081395 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:24:25 crc kubenswrapper[4735]: E0317 03:24:25.082066 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:24:29 crc kubenswrapper[4735]: E0317 03:24:29.303969 4735 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.65:47932->38.102.83.65:40841: read tcp 38.102.83.65:47932->38.102.83.65:40841: read: connection reset by peer Mar 17 03:24:40 crc kubenswrapper[4735]: I0317 03:24:40.075125 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:24:40 crc kubenswrapper[4735]: E0317 03:24:40.076162 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:24:51 crc kubenswrapper[4735]: I0317 03:24:51.073556 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:24:51 crc kubenswrapper[4735]: E0317 03:24:51.074327 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:25:05 crc kubenswrapper[4735]: I0317 03:25:05.081699 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:25:05 crc kubenswrapper[4735]: E0317 03:25:05.082646 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:25:06 crc kubenswrapper[4735]: I0317 03:25:06.830433 4735 scope.go:117] "RemoveContainer" containerID="285bdbd71783bbc88ae918daa6d20f2f384a709ac2b7643f42e28448a7b42c6b" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.544026 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w69fv"] Mar 17 03:25:09 crc kubenswrapper[4735]: E0317 03:25:09.545322 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68eaa4d9-9f62-40f1-8718-1a046df3cc4c" containerName="oc" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.545350 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="68eaa4d9-9f62-40f1-8718-1a046df3cc4c" containerName="oc" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.545685 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="68eaa4d9-9f62-40f1-8718-1a046df3cc4c" containerName="oc" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.549538 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.573161 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w69fv"] Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.687450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-catalog-content\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.687539 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-utilities\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.687591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjzm\" (UniqueName: \"kubernetes.io/projected/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-kube-api-access-ppjzm\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.789009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-catalog-content\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.789081 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-utilities\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.789133 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjzm\" (UniqueName: \"kubernetes.io/projected/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-kube-api-access-ppjzm\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.789668 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-catalog-content\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.789707 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-utilities\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.822987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjzm\" (UniqueName: \"kubernetes.io/projected/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-kube-api-access-ppjzm\") pod \"redhat-marketplace-w69fv\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:09 crc kubenswrapper[4735]: I0317 03:25:09.885583 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:10 crc kubenswrapper[4735]: I0317 03:25:10.408524 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w69fv"] Mar 17 03:25:11 crc kubenswrapper[4735]: I0317 03:25:11.013096 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerID="9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c" exitCode=0 Mar 17 03:25:11 crc kubenswrapper[4735]: I0317 03:25:11.013178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerDied","Data":"9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c"} Mar 17 03:25:11 crc kubenswrapper[4735]: I0317 03:25:11.013261 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerStarted","Data":"02644d412e3bb0d08003395dfc5daf66c6954044bfe9ba765100a8bf741d9184"} Mar 17 03:25:12 crc kubenswrapper[4735]: I0317 03:25:12.024693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerStarted","Data":"6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c"} Mar 17 03:25:14 crc kubenswrapper[4735]: I0317 03:25:14.043236 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerID="6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c" exitCode=0 Mar 17 03:25:14 crc kubenswrapper[4735]: I0317 03:25:14.043285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" 
event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerDied","Data":"6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c"} Mar 17 03:25:15 crc kubenswrapper[4735]: I0317 03:25:15.054394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerStarted","Data":"e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560"} Mar 17 03:25:15 crc kubenswrapper[4735]: I0317 03:25:15.091343 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w69fv" podStartSLOduration=2.697495648 podStartE2EDuration="6.0913231s" podCreationTimestamp="2026-03-17 03:25:09 +0000 UTC" firstStartedPulling="2026-03-17 03:25:11.019057155 +0000 UTC m=+8136.651290133" lastFinishedPulling="2026-03-17 03:25:14.412884587 +0000 UTC m=+8140.045117585" observedRunningTime="2026-03-17 03:25:15.076425153 +0000 UTC m=+8140.708658171" watchObservedRunningTime="2026-03-17 03:25:15.0913231 +0000 UTC m=+8140.723556078" Mar 17 03:25:19 crc kubenswrapper[4735]: I0317 03:25:19.073477 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:25:19 crc kubenswrapper[4735]: E0317 03:25:19.074491 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:25:19 crc kubenswrapper[4735]: I0317 03:25:19.885759 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:19 crc 
kubenswrapper[4735]: I0317 03:25:19.886031 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:20 crc kubenswrapper[4735]: I0317 03:25:20.936141 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-w69fv" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="registry-server" probeResult="failure" output=< Mar 17 03:25:20 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:25:20 crc kubenswrapper[4735]: > Mar 17 03:25:29 crc kubenswrapper[4735]: I0317 03:25:29.949166 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:30 crc kubenswrapper[4735]: I0317 03:25:30.007441 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:30 crc kubenswrapper[4735]: I0317 03:25:30.183307 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w69fv"] Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.073325 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:25:31 crc kubenswrapper[4735]: E0317 03:25:31.074107 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.220607 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w69fv" 
podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="registry-server" containerID="cri-o://e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560" gracePeriod=2 Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.817816 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.972489 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppjzm\" (UniqueName: \"kubernetes.io/projected/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-kube-api-access-ppjzm\") pod \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.972558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-utilities\") pod \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.972644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-catalog-content\") pod \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\" (UID: \"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2\") " Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.973709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-utilities" (OuterVolumeSpecName: "utilities") pod "1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" (UID: "1d6eb608-dc9d-4d8c-af64-4ac71f794bb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.984673 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-kube-api-access-ppjzm" (OuterVolumeSpecName: "kube-api-access-ppjzm") pod "1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" (UID: "1d6eb608-dc9d-4d8c-af64-4ac71f794bb2"). InnerVolumeSpecName "kube-api-access-ppjzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:25:31 crc kubenswrapper[4735]: I0317 03:25:31.997707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" (UID: "1d6eb608-dc9d-4d8c-af64-4ac71f794bb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.074816 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.074847 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppjzm\" (UniqueName: \"kubernetes.io/projected/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-kube-api-access-ppjzm\") on node \"crc\" DevicePath \"\"" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.074874 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.233290 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" 
containerID="e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560" exitCode=0 Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.233348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerDied","Data":"e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560"} Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.233362 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w69fv" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.233378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w69fv" event={"ID":"1d6eb608-dc9d-4d8c-af64-4ac71f794bb2","Type":"ContainerDied","Data":"02644d412e3bb0d08003395dfc5daf66c6954044bfe9ba765100a8bf741d9184"} Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.233396 4735 scope.go:117] "RemoveContainer" containerID="e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.265392 4735 scope.go:117] "RemoveContainer" containerID="6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.279945 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w69fv"] Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.289609 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w69fv"] Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.290588 4735 scope.go:117] "RemoveContainer" containerID="9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.347119 4735 scope.go:117] "RemoveContainer" containerID="e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560" Mar 17 
03:25:32 crc kubenswrapper[4735]: E0317 03:25:32.348083 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560\": container with ID starting with e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560 not found: ID does not exist" containerID="e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.348121 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560"} err="failed to get container status \"e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560\": rpc error: code = NotFound desc = could not find container \"e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560\": container with ID starting with e997a1f48de92be16a6855b4416f7ccd55e9400e7de322ba3f20741e911e1560 not found: ID does not exist" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.348176 4735 scope.go:117] "RemoveContainer" containerID="6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c" Mar 17 03:25:32 crc kubenswrapper[4735]: E0317 03:25:32.348984 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c\": container with ID starting with 6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c not found: ID does not exist" containerID="6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.349075 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c"} err="failed to get container status 
\"6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c\": rpc error: code = NotFound desc = could not find container \"6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c\": container with ID starting with 6c9f362d5e6f19a1a1b12227edc562b7de68369aa18cd7191562c98f82f8931c not found: ID does not exist" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.349127 4735 scope.go:117] "RemoveContainer" containerID="9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c" Mar 17 03:25:32 crc kubenswrapper[4735]: E0317 03:25:32.349797 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c\": container with ID starting with 9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c not found: ID does not exist" containerID="9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c" Mar 17 03:25:32 crc kubenswrapper[4735]: I0317 03:25:32.349827 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c"} err="failed to get container status \"9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c\": rpc error: code = NotFound desc = could not find container \"9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c\": container with ID starting with 9e8900cea7b75974085edce9f5a4a71f7a08aba02708a77fad7cf4706ecdda6c not found: ID does not exist" Mar 17 03:25:33 crc kubenswrapper[4735]: I0317 03:25:33.096560 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" path="/var/lib/kubelet/pods/1d6eb608-dc9d-4d8c-af64-4ac71f794bb2/volumes" Mar 17 03:25:46 crc kubenswrapper[4735]: I0317 03:25:46.073785 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 
03:25:46 crc kubenswrapper[4735]: E0317 03:25:46.074612 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.074854 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:26:00 crc kubenswrapper[4735]: E0317 03:26:00.076042 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.162110 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561966-8wptr"] Mar 17 03:26:00 crc kubenswrapper[4735]: E0317 03:26:00.162689 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="extract-content" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.162739 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="extract-content" Mar 17 03:26:00 crc kubenswrapper[4735]: E0317 03:26:00.162782 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="extract-utilities" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.162795 
4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="extract-utilities" Mar 17 03:26:00 crc kubenswrapper[4735]: E0317 03:26:00.162819 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="registry-server" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.162831 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="registry-server" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.163171 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6eb608-dc9d-4d8c-af64-4ac71f794bb2" containerName="registry-server" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.164928 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.174694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.174816 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.174895 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561966-8wptr"] Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.175788 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.280264 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mxx\" (UniqueName: \"kubernetes.io/projected/25442a6d-8495-493d-a41b-b4bb330d55c7-kube-api-access-88mxx\") pod \"auto-csr-approver-29561966-8wptr\" (UID: 
\"25442a6d-8495-493d-a41b-b4bb330d55c7\") " pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.382306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mxx\" (UniqueName: \"kubernetes.io/projected/25442a6d-8495-493d-a41b-b4bb330d55c7-kube-api-access-88mxx\") pod \"auto-csr-approver-29561966-8wptr\" (UID: \"25442a6d-8495-493d-a41b-b4bb330d55c7\") " pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.406428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mxx\" (UniqueName: \"kubernetes.io/projected/25442a6d-8495-493d-a41b-b4bb330d55c7-kube-api-access-88mxx\") pod \"auto-csr-approver-29561966-8wptr\" (UID: \"25442a6d-8495-493d-a41b-b4bb330d55c7\") " pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:00 crc kubenswrapper[4735]: I0317 03:26:00.494040 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:01 crc kubenswrapper[4735]: I0317 03:26:01.014117 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561966-8wptr"] Mar 17 03:26:01 crc kubenswrapper[4735]: I0317 03:26:01.512800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561966-8wptr" event={"ID":"25442a6d-8495-493d-a41b-b4bb330d55c7","Type":"ContainerStarted","Data":"f5d4f6bf7b7ee02a042205f7a4b1c77c7b4896fc39d9ecf2af7f81fed0bd5d3e"} Mar 17 03:26:02 crc kubenswrapper[4735]: I0317 03:26:02.522722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561966-8wptr" event={"ID":"25442a6d-8495-493d-a41b-b4bb330d55c7","Type":"ContainerStarted","Data":"53129f67dfe8b508478f008111f73c4d07466f2c3c0f0b751d2daca8f6413b99"} Mar 17 03:26:02 crc kubenswrapper[4735]: I0317 03:26:02.542339 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561966-8wptr" podStartSLOduration=1.458811907 podStartE2EDuration="2.542316028s" podCreationTimestamp="2026-03-17 03:26:00 +0000 UTC" firstStartedPulling="2026-03-17 03:26:01.014600688 +0000 UTC m=+8186.646833706" lastFinishedPulling="2026-03-17 03:26:02.098104819 +0000 UTC m=+8187.730337827" observedRunningTime="2026-03-17 03:26:02.534478089 +0000 UTC m=+8188.166711067" watchObservedRunningTime="2026-03-17 03:26:02.542316028 +0000 UTC m=+8188.174549026" Mar 17 03:26:04 crc kubenswrapper[4735]: I0317 03:26:04.554870 4735 generic.go:334] "Generic (PLEG): container finished" podID="25442a6d-8495-493d-a41b-b4bb330d55c7" containerID="53129f67dfe8b508478f008111f73c4d07466f2c3c0f0b751d2daca8f6413b99" exitCode=0 Mar 17 03:26:04 crc kubenswrapper[4735]: I0317 03:26:04.554896 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561966-8wptr" 
event={"ID":"25442a6d-8495-493d-a41b-b4bb330d55c7","Type":"ContainerDied","Data":"53129f67dfe8b508478f008111f73c4d07466f2c3c0f0b751d2daca8f6413b99"} Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.025900 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.198972 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mxx\" (UniqueName: \"kubernetes.io/projected/25442a6d-8495-493d-a41b-b4bb330d55c7-kube-api-access-88mxx\") pod \"25442a6d-8495-493d-a41b-b4bb330d55c7\" (UID: \"25442a6d-8495-493d-a41b-b4bb330d55c7\") " Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.205563 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25442a6d-8495-493d-a41b-b4bb330d55c7-kube-api-access-88mxx" (OuterVolumeSpecName: "kube-api-access-88mxx") pod "25442a6d-8495-493d-a41b-b4bb330d55c7" (UID: "25442a6d-8495-493d-a41b-b4bb330d55c7"). InnerVolumeSpecName "kube-api-access-88mxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.302340 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88mxx\" (UniqueName: \"kubernetes.io/projected/25442a6d-8495-493d-a41b-b4bb330d55c7-kube-api-access-88mxx\") on node \"crc\" DevicePath \"\"" Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.578302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561966-8wptr" event={"ID":"25442a6d-8495-493d-a41b-b4bb330d55c7","Type":"ContainerDied","Data":"f5d4f6bf7b7ee02a042205f7a4b1c77c7b4896fc39d9ecf2af7f81fed0bd5d3e"} Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.578611 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5d4f6bf7b7ee02a042205f7a4b1c77c7b4896fc39d9ecf2af7f81fed0bd5d3e" Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.578384 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561966-8wptr" Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.655067 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561960-rsfrs"] Mar 17 03:26:06 crc kubenswrapper[4735]: I0317 03:26:06.664711 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561960-rsfrs"] Mar 17 03:26:07 crc kubenswrapper[4735]: I0317 03:26:07.086638 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f2194c-eab4-4989-8709-9c6f8e2c99f4" path="/var/lib/kubelet/pods/13f2194c-eab4-4989-8709-9c6f8e2c99f4/volumes" Mar 17 03:26:13 crc kubenswrapper[4735]: I0317 03:26:13.073315 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:26:13 crc kubenswrapper[4735]: E0317 03:26:13.078280 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:26:27 crc kubenswrapper[4735]: I0317 03:26:27.077210 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:26:27 crc kubenswrapper[4735]: E0317 03:26:27.078309 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:26:29 crc kubenswrapper[4735]: I0317 03:26:29.979843 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zbtrt"] Mar 17 03:26:29 crc kubenswrapper[4735]: E0317 03:26:29.981508 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25442a6d-8495-493d-a41b-b4bb330d55c7" containerName="oc" Mar 17 03:26:29 crc kubenswrapper[4735]: I0317 03:26:29.981575 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="25442a6d-8495-493d-a41b-b4bb330d55c7" containerName="oc" Mar 17 03:26:29 crc kubenswrapper[4735]: I0317 03:26:29.981805 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="25442a6d-8495-493d-a41b-b4bb330d55c7" containerName="oc" Mar 17 03:26:29 crc kubenswrapper[4735]: I0317 03:26:29.983273 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.007190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbtrt"] Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.077772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-catalog-content\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.077831 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-utilities\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.077968 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8bpb\" (UniqueName: \"kubernetes.io/projected/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-kube-api-access-f8bpb\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.179659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8bpb\" (UniqueName: \"kubernetes.io/projected/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-kube-api-access-f8bpb\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.179809 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-catalog-content\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.180223 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-utilities\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.180689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-catalog-content\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.180661 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-utilities\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.205247 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8bpb\" (UniqueName: \"kubernetes.io/projected/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-kube-api-access-f8bpb\") pod \"certified-operators-zbtrt\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.303118 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:30 crc kubenswrapper[4735]: I0317 03:26:30.840902 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbtrt"] Mar 17 03:26:31 crc kubenswrapper[4735]: I0317 03:26:31.086425 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:26:31 crc kubenswrapper[4735]: I0317 03:26:31.098736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerStarted","Data":"72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074"} Mar 17 03:26:31 crc kubenswrapper[4735]: I0317 03:26:31.098773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerStarted","Data":"ef05ecd049a133fbe189dadff74a6d808dee7937d45feff05e0906c4759452ea"} Mar 17 03:26:32 crc kubenswrapper[4735]: I0317 03:26:32.086079 4735 generic.go:334] "Generic (PLEG): container finished" podID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerID="72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074" exitCode=0 Mar 17 03:26:32 crc kubenswrapper[4735]: I0317 03:26:32.086356 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerDied","Data":"72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074"} Mar 17 03:26:32 crc kubenswrapper[4735]: I0317 03:26:32.086471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerStarted","Data":"4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776"} Mar 17 03:26:34 crc 
kubenswrapper[4735]: I0317 03:26:34.103823 4735 generic.go:334] "Generic (PLEG): container finished" podID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerID="4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776" exitCode=0 Mar 17 03:26:34 crc kubenswrapper[4735]: I0317 03:26:34.103884 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerDied","Data":"4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776"} Mar 17 03:26:35 crc kubenswrapper[4735]: I0317 03:26:35.121384 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerStarted","Data":"979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180"} Mar 17 03:26:35 crc kubenswrapper[4735]: I0317 03:26:35.143696 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zbtrt" podStartSLOduration=2.71728752 podStartE2EDuration="6.143681155s" podCreationTimestamp="2026-03-17 03:26:29 +0000 UTC" firstStartedPulling="2026-03-17 03:26:31.082724501 +0000 UTC m=+8216.714957479" lastFinishedPulling="2026-03-17 03:26:34.509118136 +0000 UTC m=+8220.141351114" observedRunningTime="2026-03-17 03:26:35.137330431 +0000 UTC m=+8220.769563409" watchObservedRunningTime="2026-03-17 03:26:35.143681155 +0000 UTC m=+8220.775914133" Mar 17 03:26:40 crc kubenswrapper[4735]: I0317 03:26:40.303543 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:40 crc kubenswrapper[4735]: I0317 03:26:40.304073 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:41 crc kubenswrapper[4735]: I0317 03:26:41.073468 4735 scope.go:117] "RemoveContainer" 
containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:26:41 crc kubenswrapper[4735]: E0317 03:26:41.074093 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:26:41 crc kubenswrapper[4735]: I0317 03:26:41.354417 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zbtrt" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="registry-server" probeResult="failure" output=< Mar 17 03:26:41 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:26:41 crc kubenswrapper[4735]: > Mar 17 03:26:50 crc kubenswrapper[4735]: I0317 03:26:50.414512 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:50 crc kubenswrapper[4735]: I0317 03:26:50.487944 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:50 crc kubenswrapper[4735]: I0317 03:26:50.666586 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbtrt"] Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.073430 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.275045 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zbtrt" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="registry-server" 
containerID="cri-o://979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180" gracePeriod=2 Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.275338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"ce16d7d9e3fd849b9c8c1200e28d72fd41d4e80ffef08977f79639cb676e13b2"} Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.820101 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.940541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8bpb\" (UniqueName: \"kubernetes.io/projected/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-kube-api-access-f8bpb\") pod \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.940839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-utilities\") pod \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.940896 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-catalog-content\") pod \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\" (UID: \"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f\") " Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.941365 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-utilities" (OuterVolumeSpecName: "utilities") pod "d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" 
(UID: "d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.945722 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-kube-api-access-f8bpb" (OuterVolumeSpecName: "kube-api-access-f8bpb") pod "d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" (UID: "d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f"). InnerVolumeSpecName "kube-api-access-f8bpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:26:52 crc kubenswrapper[4735]: I0317 03:26:52.974394 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" (UID: "d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.042687 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8bpb\" (UniqueName: \"kubernetes.io/projected/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-kube-api-access-f8bpb\") on node \"crc\" DevicePath \"\"" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.042715 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.042726 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.286114 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerID="979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180" exitCode=0 Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.286154 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbtrt" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.286158 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerDied","Data":"979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180"} Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.286184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbtrt" event={"ID":"d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f","Type":"ContainerDied","Data":"ef05ecd049a133fbe189dadff74a6d808dee7937d45feff05e0906c4759452ea"} Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.286206 4735 scope.go:117] "RemoveContainer" containerID="979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.310683 4735 scope.go:117] "RemoveContainer" containerID="4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.324299 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbtrt"] Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.338036 4735 scope.go:117] "RemoveContainer" containerID="72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.341272 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zbtrt"] Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.405441 4735 scope.go:117] "RemoveContainer" 
containerID="979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180" Mar 17 03:26:53 crc kubenswrapper[4735]: E0317 03:26:53.406397 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180\": container with ID starting with 979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180 not found: ID does not exist" containerID="979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.406652 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180"} err="failed to get container status \"979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180\": rpc error: code = NotFound desc = could not find container \"979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180\": container with ID starting with 979dc8de460d9f57220775875ae151d4d1975fd0d5890989fe4efcab1b5c2180 not found: ID does not exist" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.406888 4735 scope.go:117] "RemoveContainer" containerID="4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776" Mar 17 03:26:53 crc kubenswrapper[4735]: E0317 03:26:53.407525 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776\": container with ID starting with 4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776 not found: ID does not exist" containerID="4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.407565 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776"} err="failed to get container status \"4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776\": rpc error: code = NotFound desc = could not find container \"4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776\": container with ID starting with 4a2545b231c690f0e8ca0f953988c169a164f38df902af8316050223307ac776 not found: ID does not exist" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.407591 4735 scope.go:117] "RemoveContainer" containerID="72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074" Mar 17 03:26:53 crc kubenswrapper[4735]: E0317 03:26:53.408127 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074\": container with ID starting with 72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074 not found: ID does not exist" containerID="72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074" Mar 17 03:26:53 crc kubenswrapper[4735]: I0317 03:26:53.408143 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074"} err="failed to get container status \"72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074\": rpc error: code = NotFound desc = could not find container \"72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074\": container with ID starting with 72d29aceaf16f6a0a747cc4e7f02cf04f533ccbc7d9842c46731760f0a29d074 not found: ID does not exist" Mar 17 03:26:55 crc kubenswrapper[4735]: I0317 03:26:55.084470 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" path="/var/lib/kubelet/pods/d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f/volumes" Mar 17 03:27:07 crc kubenswrapper[4735]: I0317 
03:27:07.018623 4735 scope.go:117] "RemoveContainer" containerID="d5b0a92de13772b86e234a1781179f6b0065b40950798a6df33ec151d58c4af7" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.161154 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561968-8ntm9"] Mar 17 03:28:00 crc kubenswrapper[4735]: E0317 03:28:00.162255 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="registry-server" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.162276 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="registry-server" Mar 17 03:28:00 crc kubenswrapper[4735]: E0317 03:28:00.162318 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="extract-content" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.162331 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="extract-content" Mar 17 03:28:00 crc kubenswrapper[4735]: E0317 03:28:00.162378 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="extract-utilities" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.162392 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="extract-utilities" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.162749 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cc7fc4-3dd8-4dab-8dd4-3376b5aaa42f" containerName="registry-server" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.163740 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.168948 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.169305 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.169611 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.174985 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561968-8ntm9"] Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.235952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27r65\" (UniqueName: \"kubernetes.io/projected/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be-kube-api-access-27r65\") pod \"auto-csr-approver-29561968-8ntm9\" (UID: \"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be\") " pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.337534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27r65\" (UniqueName: \"kubernetes.io/projected/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be-kube-api-access-27r65\") pod \"auto-csr-approver-29561968-8ntm9\" (UID: \"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be\") " pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.362847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27r65\" (UniqueName: \"kubernetes.io/projected/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be-kube-api-access-27r65\") pod \"auto-csr-approver-29561968-8ntm9\" (UID: \"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be\") " 
pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:00 crc kubenswrapper[4735]: I0317 03:28:00.482730 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:01 crc kubenswrapper[4735]: I0317 03:28:01.003271 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561968-8ntm9"] Mar 17 03:28:02 crc kubenswrapper[4735]: I0317 03:28:02.008043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" event={"ID":"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be","Type":"ContainerStarted","Data":"bf14dbae39573a4518df1d4d3a39f57bd3a89632e1c27d6078dad78e6f66ad64"} Mar 17 03:28:03 crc kubenswrapper[4735]: I0317 03:28:03.022454 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" event={"ID":"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be","Type":"ContainerStarted","Data":"145c0aa1e958ddd92bc7b9f6e7588c32e378eda7d1371db99d40b2378826b2c4"} Mar 17 03:28:03 crc kubenswrapper[4735]: I0317 03:28:03.039967 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" podStartSLOduration=1.934802515 podStartE2EDuration="3.039941666s" podCreationTimestamp="2026-03-17 03:28:00 +0000 UTC" firstStartedPulling="2026-03-17 03:28:01.019296676 +0000 UTC m=+8306.651529654" lastFinishedPulling="2026-03-17 03:28:02.124435817 +0000 UTC m=+8307.756668805" observedRunningTime="2026-03-17 03:28:03.037793063 +0000 UTC m=+8308.670026051" watchObservedRunningTime="2026-03-17 03:28:03.039941666 +0000 UTC m=+8308.672174684" Mar 17 03:28:04 crc kubenswrapper[4735]: I0317 03:28:04.035895 4735 generic.go:334] "Generic (PLEG): container finished" podID="2faba8fe-c48d-44fe-9b5d-dca3efc4a0be" containerID="145c0aa1e958ddd92bc7b9f6e7588c32e378eda7d1371db99d40b2378826b2c4" exitCode=0 Mar 17 03:28:04 crc 
kubenswrapper[4735]: I0317 03:28:04.035993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" event={"ID":"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be","Type":"ContainerDied","Data":"145c0aa1e958ddd92bc7b9f6e7588c32e378eda7d1371db99d40b2378826b2c4"} Mar 17 03:28:05 crc kubenswrapper[4735]: I0317 03:28:05.462613 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:05 crc kubenswrapper[4735]: I0317 03:28:05.547933 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27r65\" (UniqueName: \"kubernetes.io/projected/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be-kube-api-access-27r65\") pod \"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be\" (UID: \"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be\") " Mar 17 03:28:05 crc kubenswrapper[4735]: I0317 03:28:05.569968 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be-kube-api-access-27r65" (OuterVolumeSpecName: "kube-api-access-27r65") pod "2faba8fe-c48d-44fe-9b5d-dca3efc4a0be" (UID: "2faba8fe-c48d-44fe-9b5d-dca3efc4a0be"). InnerVolumeSpecName "kube-api-access-27r65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:28:05 crc kubenswrapper[4735]: I0317 03:28:05.650316 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27r65\" (UniqueName: \"kubernetes.io/projected/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be-kube-api-access-27r65\") on node \"crc\" DevicePath \"\"" Mar 17 03:28:06 crc kubenswrapper[4735]: I0317 03:28:06.081135 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" event={"ID":"2faba8fe-c48d-44fe-9b5d-dca3efc4a0be","Type":"ContainerDied","Data":"bf14dbae39573a4518df1d4d3a39f57bd3a89632e1c27d6078dad78e6f66ad64"} Mar 17 03:28:06 crc kubenswrapper[4735]: I0317 03:28:06.081420 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf14dbae39573a4518df1d4d3a39f57bd3a89632e1c27d6078dad78e6f66ad64" Mar 17 03:28:06 crc kubenswrapper[4735]: I0317 03:28:06.081254 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561968-8ntm9" Mar 17 03:28:06 crc kubenswrapper[4735]: I0317 03:28:06.122978 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561962-nzj7s"] Mar 17 03:28:06 crc kubenswrapper[4735]: I0317 03:28:06.131636 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561962-nzj7s"] Mar 17 03:28:07 crc kubenswrapper[4735]: I0317 03:28:07.086818 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb5d427f-5745-45ff-9e82-75a34b12da44" path="/var/lib/kubelet/pods/eb5d427f-5745-45ff-9e82-75a34b12da44/volumes" Mar 17 03:28:07 crc kubenswrapper[4735]: I0317 03:28:07.126160 4735 scope.go:117] "RemoveContainer" containerID="827682ca782a56404b054affbc8c4de0958f1584861c9c089d744a8c780f57cd" Mar 17 03:29:12 crc kubenswrapper[4735]: I0317 03:29:12.606093 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:29:12 crc kubenswrapper[4735]: I0317 03:29:12.607988 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:29:42 crc kubenswrapper[4735]: I0317 03:29:42.606166 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:29:42 crc kubenswrapper[4735]: I0317 03:29:42.606661 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.157208 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561970-4pwmq"] Mar 17 03:30:00 crc kubenswrapper[4735]: E0317 03:30:00.158824 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faba8fe-c48d-44fe-9b5d-dca3efc4a0be" containerName="oc" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.158842 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faba8fe-c48d-44fe-9b5d-dca3efc4a0be" containerName="oc" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.159308 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2faba8fe-c48d-44fe-9b5d-dca3efc4a0be" containerName="oc" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.160456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.163531 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.164024 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.164342 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.194347 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8"] Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.197389 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.199681 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.203704 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.213498 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8"] Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.225446 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561970-4pwmq"] Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.265232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gf7x\" (UniqueName: \"kubernetes.io/projected/f3c13609-dc71-4248-992c-b90968adf138-kube-api-access-9gf7x\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.265415 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7n2q\" (UniqueName: \"kubernetes.io/projected/2841700d-a5ad-421e-8715-2071bae5b5e5-kube-api-access-n7n2q\") pod \"auto-csr-approver-29561970-4pwmq\" (UID: \"2841700d-a5ad-421e-8715-2071bae5b5e5\") " pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.265612 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f3c13609-dc71-4248-992c-b90968adf138-config-volume\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.265719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c13609-dc71-4248-992c-b90968adf138-secret-volume\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.368214 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gf7x\" (UniqueName: \"kubernetes.io/projected/f3c13609-dc71-4248-992c-b90968adf138-kube-api-access-9gf7x\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.368335 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7n2q\" (UniqueName: \"kubernetes.io/projected/2841700d-a5ad-421e-8715-2071bae5b5e5-kube-api-access-n7n2q\") pod \"auto-csr-approver-29561970-4pwmq\" (UID: \"2841700d-a5ad-421e-8715-2071bae5b5e5\") " pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.368447 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c13609-dc71-4248-992c-b90968adf138-config-volume\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: 
I0317 03:30:00.368500 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c13609-dc71-4248-992c-b90968adf138-secret-volume\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.370056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c13609-dc71-4248-992c-b90968adf138-config-volume\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.382048 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c13609-dc71-4248-992c-b90968adf138-secret-volume\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.392634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gf7x\" (UniqueName: \"kubernetes.io/projected/f3c13609-dc71-4248-992c-b90968adf138-kube-api-access-9gf7x\") pod \"collect-profiles-29561970-fgtm8\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.397210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7n2q\" (UniqueName: \"kubernetes.io/projected/2841700d-a5ad-421e-8715-2071bae5b5e5-kube-api-access-n7n2q\") pod \"auto-csr-approver-29561970-4pwmq\" (UID: \"2841700d-a5ad-421e-8715-2071bae5b5e5\") " 
pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.487448 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.525370 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:00 crc kubenswrapper[4735]: I0317 03:30:00.992039 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561970-4pwmq"] Mar 17 03:30:01 crc kubenswrapper[4735]: I0317 03:30:01.050615 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8"] Mar 17 03:30:01 crc kubenswrapper[4735]: W0317 03:30:01.052421 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c13609_dc71_4248_992c_b90968adf138.slice/crio-2827c799f45609d053005c605950a62193bcef05da45e1211909c5102e10e088 WatchSource:0}: Error finding container 2827c799f45609d053005c605950a62193bcef05da45e1211909c5102e10e088: Status 404 returned error can't find the container with id 2827c799f45609d053005c605950a62193bcef05da45e1211909c5102e10e088 Mar 17 03:30:01 crc kubenswrapper[4735]: I0317 03:30:01.321178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" event={"ID":"2841700d-a5ad-421e-8715-2071bae5b5e5","Type":"ContainerStarted","Data":"fd4efd6ffc41179987f5525cb081236e4a313cbab594e5c8fb98626a5ba3655a"} Mar 17 03:30:01 crc kubenswrapper[4735]: I0317 03:30:01.324594 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" 
event={"ID":"f3c13609-dc71-4248-992c-b90968adf138","Type":"ContainerStarted","Data":"8ed62c91c46c798c6db90079cf7137ad5de0fca61caaadc658cff30ac9f25036"} Mar 17 03:30:01 crc kubenswrapper[4735]: I0317 03:30:01.324628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" event={"ID":"f3c13609-dc71-4248-992c-b90968adf138","Type":"ContainerStarted","Data":"2827c799f45609d053005c605950a62193bcef05da45e1211909c5102e10e088"} Mar 17 03:30:01 crc kubenswrapper[4735]: I0317 03:30:01.348409 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" podStartSLOduration=1.3483856109999999 podStartE2EDuration="1.348385611s" podCreationTimestamp="2026-03-17 03:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 03:30:01.342295565 +0000 UTC m=+8426.974528543" watchObservedRunningTime="2026-03-17 03:30:01.348385611 +0000 UTC m=+8426.980618599" Mar 17 03:30:02 crc kubenswrapper[4735]: I0317 03:30:02.339181 4735 generic.go:334] "Generic (PLEG): container finished" podID="f3c13609-dc71-4248-992c-b90968adf138" containerID="8ed62c91c46c798c6db90079cf7137ad5de0fca61caaadc658cff30ac9f25036" exitCode=0 Mar 17 03:30:02 crc kubenswrapper[4735]: I0317 03:30:02.339273 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" event={"ID":"f3c13609-dc71-4248-992c-b90968adf138","Type":"ContainerDied","Data":"8ed62c91c46c798c6db90079cf7137ad5de0fca61caaadc658cff30ac9f25036"} Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.350433 4735 generic.go:334] "Generic (PLEG): container finished" podID="2841700d-a5ad-421e-8715-2071bae5b5e5" containerID="77734702bd2a30c2d66cbe3aa90a401bd22c8e6f230c29b6f993a54f87092f4b" exitCode=0 Mar 17 03:30:03 crc kubenswrapper[4735]: 
I0317 03:30:03.350528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" event={"ID":"2841700d-a5ad-421e-8715-2071bae5b5e5","Type":"ContainerDied","Data":"77734702bd2a30c2d66cbe3aa90a401bd22c8e6f230c29b6f993a54f87092f4b"} Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.771901 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.833930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gf7x\" (UniqueName: \"kubernetes.io/projected/f3c13609-dc71-4248-992c-b90968adf138-kube-api-access-9gf7x\") pod \"f3c13609-dc71-4248-992c-b90968adf138\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.834215 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c13609-dc71-4248-992c-b90968adf138-config-volume\") pod \"f3c13609-dc71-4248-992c-b90968adf138\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.834238 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c13609-dc71-4248-992c-b90968adf138-secret-volume\") pod \"f3c13609-dc71-4248-992c-b90968adf138\" (UID: \"f3c13609-dc71-4248-992c-b90968adf138\") " Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.835673 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3c13609-dc71-4248-992c-b90968adf138-config-volume" (OuterVolumeSpecName: "config-volume") pod "f3c13609-dc71-4248-992c-b90968adf138" (UID: "f3c13609-dc71-4248-992c-b90968adf138"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.851199 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c13609-dc71-4248-992c-b90968adf138-kube-api-access-9gf7x" (OuterVolumeSpecName: "kube-api-access-9gf7x") pod "f3c13609-dc71-4248-992c-b90968adf138" (UID: "f3c13609-dc71-4248-992c-b90968adf138"). InnerVolumeSpecName "kube-api-access-9gf7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.851235 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c13609-dc71-4248-992c-b90968adf138-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f3c13609-dc71-4248-992c-b90968adf138" (UID: "f3c13609-dc71-4248-992c-b90968adf138"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.935974 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c13609-dc71-4248-992c-b90968adf138-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.936010 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c13609-dc71-4248-992c-b90968adf138-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:30:03 crc kubenswrapper[4735]: I0317 03:30:03.936020 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gf7x\" (UniqueName: \"kubernetes.io/projected/f3c13609-dc71-4248-992c-b90968adf138-kube-api-access-9gf7x\") on node \"crc\" DevicePath \"\"" Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.362468 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.366932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8" event={"ID":"f3c13609-dc71-4248-992c-b90968adf138","Type":"ContainerDied","Data":"2827c799f45609d053005c605950a62193bcef05da45e1211909c5102e10e088"} Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.366991 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2827c799f45609d053005c605950a62193bcef05da45e1211909c5102e10e088" Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.455054 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5"] Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.462976 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-69ws5"] Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.840215 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.954887 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7n2q\" (UniqueName: \"kubernetes.io/projected/2841700d-a5ad-421e-8715-2071bae5b5e5-kube-api-access-n7n2q\") pod \"2841700d-a5ad-421e-8715-2071bae5b5e5\" (UID: \"2841700d-a5ad-421e-8715-2071bae5b5e5\") " Mar 17 03:30:04 crc kubenswrapper[4735]: I0317 03:30:04.960124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2841700d-a5ad-421e-8715-2071bae5b5e5-kube-api-access-n7n2q" (OuterVolumeSpecName: "kube-api-access-n7n2q") pod "2841700d-a5ad-421e-8715-2071bae5b5e5" (UID: "2841700d-a5ad-421e-8715-2071bae5b5e5"). 
InnerVolumeSpecName "kube-api-access-n7n2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.057250 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7n2q\" (UniqueName: \"kubernetes.io/projected/2841700d-a5ad-421e-8715-2071bae5b5e5-kube-api-access-n7n2q\") on node \"crc\" DevicePath \"\"" Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.086295 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a45e01-9170-4471-aa81-896799b0bb80" path="/var/lib/kubelet/pods/65a45e01-9170-4471-aa81-896799b0bb80/volumes" Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.370541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" event={"ID":"2841700d-a5ad-421e-8715-2071bae5b5e5","Type":"ContainerDied","Data":"fd4efd6ffc41179987f5525cb081236e4a313cbab594e5c8fb98626a5ba3655a"} Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.370576 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd4efd6ffc41179987f5525cb081236e4a313cbab594e5c8fb98626a5ba3655a" Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.371300 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561970-4pwmq" Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.908528 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561964-kmhgl"] Mar 17 03:30:05 crc kubenswrapper[4735]: I0317 03:30:05.919915 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561964-kmhgl"] Mar 17 03:30:07 crc kubenswrapper[4735]: I0317 03:30:07.085272 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68eaa4d9-9f62-40f1-8718-1a046df3cc4c" path="/var/lib/kubelet/pods/68eaa4d9-9f62-40f1-8718-1a046df3cc4c/volumes" Mar 17 03:30:07 crc kubenswrapper[4735]: I0317 03:30:07.238241 4735 scope.go:117] "RemoveContainer" containerID="88b3e4d0d2daee786a639d7bbee7fbeda066e8a45c00538ec664dcb8b839fd48" Mar 17 03:30:07 crc kubenswrapper[4735]: I0317 03:30:07.304760 4735 scope.go:117] "RemoveContainer" containerID="6b2417659dc00e4326c2c7b58569585e1fa96bec7ae9c7a26f7a5121b809086f" Mar 17 03:30:12 crc kubenswrapper[4735]: I0317 03:30:12.606974 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:30:12 crc kubenswrapper[4735]: I0317 03:30:12.607614 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:30:12 crc kubenswrapper[4735]: I0317 03:30:12.607696 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 
03:30:12 crc kubenswrapper[4735]: I0317 03:30:12.608896 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce16d7d9e3fd849b9c8c1200e28d72fd41d4e80ffef08977f79639cb676e13b2"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:30:12 crc kubenswrapper[4735]: I0317 03:30:12.608987 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://ce16d7d9e3fd849b9c8c1200e28d72fd41d4e80ffef08977f79639cb676e13b2" gracePeriod=600 Mar 17 03:30:13 crc kubenswrapper[4735]: I0317 03:30:13.482932 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="ce16d7d9e3fd849b9c8c1200e28d72fd41d4e80ffef08977f79639cb676e13b2" exitCode=0 Mar 17 03:30:13 crc kubenswrapper[4735]: I0317 03:30:13.482997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"ce16d7d9e3fd849b9c8c1200e28d72fd41d4e80ffef08977f79639cb676e13b2"} Mar 17 03:30:13 crc kubenswrapper[4735]: I0317 03:30:13.483373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5"} Mar 17 03:30:13 crc kubenswrapper[4735]: I0317 03:30:13.483400 4735 scope.go:117] "RemoveContainer" containerID="4c7fbdae094f085b355113d371af81961a7d3851c1f9cec1478085c30354a5d1" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.149016 4735 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561972-dn6f6"] Mar 17 03:32:00 crc kubenswrapper[4735]: E0317 03:32:00.149932 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c13609-dc71-4248-992c-b90968adf138" containerName="collect-profiles" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.149944 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c13609-dc71-4248-992c-b90968adf138" containerName="collect-profiles" Mar 17 03:32:00 crc kubenswrapper[4735]: E0317 03:32:00.149988 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2841700d-a5ad-421e-8715-2071bae5b5e5" containerName="oc" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.149994 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841700d-a5ad-421e-8715-2071bae5b5e5" containerName="oc" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.150185 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2841700d-a5ad-421e-8715-2071bae5b5e5" containerName="oc" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.150209 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c13609-dc71-4248-992c-b90968adf138" containerName="collect-profiles" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.150832 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.154512 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.154759 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.154882 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.167527 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561972-dn6f6"] Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.258571 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6r4\" (UniqueName: \"kubernetes.io/projected/462c2898-109d-4525-aa2e-660175f723a7-kube-api-access-2q6r4\") pod \"auto-csr-approver-29561972-dn6f6\" (UID: \"462c2898-109d-4525-aa2e-660175f723a7\") " pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.360639 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6r4\" (UniqueName: \"kubernetes.io/projected/462c2898-109d-4525-aa2e-660175f723a7-kube-api-access-2q6r4\") pod \"auto-csr-approver-29561972-dn6f6\" (UID: \"462c2898-109d-4525-aa2e-660175f723a7\") " pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.393960 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6r4\" (UniqueName: \"kubernetes.io/projected/462c2898-109d-4525-aa2e-660175f723a7-kube-api-access-2q6r4\") pod \"auto-csr-approver-29561972-dn6f6\" (UID: \"462c2898-109d-4525-aa2e-660175f723a7\") " 
pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:00 crc kubenswrapper[4735]: I0317 03:32:00.470031 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:01 crc kubenswrapper[4735]: I0317 03:32:01.021376 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561972-dn6f6"] Mar 17 03:32:01 crc kubenswrapper[4735]: I0317 03:32:01.026599 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:32:01 crc kubenswrapper[4735]: I0317 03:32:01.847823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" event={"ID":"462c2898-109d-4525-aa2e-660175f723a7","Type":"ContainerStarted","Data":"dacf51c149a052978847317bb091278d926dcdb4e11ccd4951abd251ecbd6399"} Mar 17 03:32:02 crc kubenswrapper[4735]: I0317 03:32:02.861234 4735 generic.go:334] "Generic (PLEG): container finished" podID="462c2898-109d-4525-aa2e-660175f723a7" containerID="84ac71c1b2c299d75773d2c5a8cf5f04dc72fd7f651a318263d3db194296ee7a" exitCode=0 Mar 17 03:32:02 crc kubenswrapper[4735]: I0317 03:32:02.861304 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" event={"ID":"462c2898-109d-4525-aa2e-660175f723a7","Type":"ContainerDied","Data":"84ac71c1b2c299d75773d2c5a8cf5f04dc72fd7f651a318263d3db194296ee7a"} Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.339238 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.449335 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6r4\" (UniqueName: \"kubernetes.io/projected/462c2898-109d-4525-aa2e-660175f723a7-kube-api-access-2q6r4\") pod \"462c2898-109d-4525-aa2e-660175f723a7\" (UID: \"462c2898-109d-4525-aa2e-660175f723a7\") " Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.460106 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462c2898-109d-4525-aa2e-660175f723a7-kube-api-access-2q6r4" (OuterVolumeSpecName: "kube-api-access-2q6r4") pod "462c2898-109d-4525-aa2e-660175f723a7" (UID: "462c2898-109d-4525-aa2e-660175f723a7"). InnerVolumeSpecName "kube-api-access-2q6r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.551740 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6r4\" (UniqueName: \"kubernetes.io/projected/462c2898-109d-4525-aa2e-660175f723a7-kube-api-access-2q6r4\") on node \"crc\" DevicePath \"\"" Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.884829 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" event={"ID":"462c2898-109d-4525-aa2e-660175f723a7","Type":"ContainerDied","Data":"dacf51c149a052978847317bb091278d926dcdb4e11ccd4951abd251ecbd6399"} Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.884916 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dacf51c149a052978847317bb091278d926dcdb4e11ccd4951abd251ecbd6399" Mar 17 03:32:04 crc kubenswrapper[4735]: I0317 03:32:04.884920 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561972-dn6f6" Mar 17 03:32:05 crc kubenswrapper[4735]: I0317 03:32:05.442233 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561966-8wptr"] Mar 17 03:32:05 crc kubenswrapper[4735]: I0317 03:32:05.457920 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561966-8wptr"] Mar 17 03:32:07 crc kubenswrapper[4735]: I0317 03:32:07.091542 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25442a6d-8495-493d-a41b-b4bb330d55c7" path="/var/lib/kubelet/pods/25442a6d-8495-493d-a41b-b4bb330d55c7/volumes" Mar 17 03:32:07 crc kubenswrapper[4735]: I0317 03:32:07.453381 4735 scope.go:117] "RemoveContainer" containerID="53129f67dfe8b508478f008111f73c4d07466f2c3c0f0b751d2daca8f6413b99" Mar 17 03:32:12 crc kubenswrapper[4735]: I0317 03:32:12.606348 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:32:12 crc kubenswrapper[4735]: I0317 03:32:12.606847 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:32:42 crc kubenswrapper[4735]: I0317 03:32:42.606243 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:32:42 crc kubenswrapper[4735]: 
I0317 03:32:42.606840 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:32:49 crc kubenswrapper[4735]: I0317 03:32:49.936799 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kntpk"] Mar 17 03:32:49 crc kubenswrapper[4735]: E0317 03:32:49.937636 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462c2898-109d-4525-aa2e-660175f723a7" containerName="oc" Mar 17 03:32:49 crc kubenswrapper[4735]: I0317 03:32:49.937647 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="462c2898-109d-4525-aa2e-660175f723a7" containerName="oc" Mar 17 03:32:49 crc kubenswrapper[4735]: I0317 03:32:49.937867 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="462c2898-109d-4525-aa2e-660175f723a7" containerName="oc" Mar 17 03:32:49 crc kubenswrapper[4735]: I0317 03:32:49.939199 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:49 crc kubenswrapper[4735]: I0317 03:32:49.949326 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kntpk"] Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.052354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5v8\" (UniqueName: \"kubernetes.io/projected/675cb00a-27d5-43e5-aae9-a919357c716a-kube-api-access-mx5v8\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.052656 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-catalog-content\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.053000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-utilities\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.155260 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5v8\" (UniqueName: \"kubernetes.io/projected/675cb00a-27d5-43e5-aae9-a919357c716a-kube-api-access-mx5v8\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.155657 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-catalog-content\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.156006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-utilities\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.156213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-catalog-content\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.157261 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-utilities\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.176284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5v8\" (UniqueName: \"kubernetes.io/projected/675cb00a-27d5-43e5-aae9-a919357c716a-kube-api-access-mx5v8\") pod \"redhat-operators-kntpk\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.276503 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:32:50 crc kubenswrapper[4735]: I0317 03:32:50.762882 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kntpk"] Mar 17 03:32:51 crc kubenswrapper[4735]: I0317 03:32:51.433427 4735 generic.go:334] "Generic (PLEG): container finished" podID="675cb00a-27d5-43e5-aae9-a919357c716a" containerID="ce499c78bb1b55f5728934a95449c4dfe9d1b6969e8a7e5a991081415068ae1d" exitCode=0 Mar 17 03:32:51 crc kubenswrapper[4735]: I0317 03:32:51.433489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerDied","Data":"ce499c78bb1b55f5728934a95449c4dfe9d1b6969e8a7e5a991081415068ae1d"} Mar 17 03:32:51 crc kubenswrapper[4735]: I0317 03:32:51.433965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerStarted","Data":"c29f61ac2e6fa043b7a761f76bf587ac625ac8b21fed73dc34095d9141e5a7fd"} Mar 17 03:32:52 crc kubenswrapper[4735]: I0317 03:32:52.442270 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerStarted","Data":"e0315d8cc0acd737c3a716b79e4694046e9f35af282f50d84fae5d08baa4f019"} Mar 17 03:32:57 crc kubenswrapper[4735]: I0317 03:32:57.502492 4735 generic.go:334] "Generic (PLEG): container finished" podID="675cb00a-27d5-43e5-aae9-a919357c716a" containerID="e0315d8cc0acd737c3a716b79e4694046e9f35af282f50d84fae5d08baa4f019" exitCode=0 Mar 17 03:32:57 crc kubenswrapper[4735]: I0317 03:32:57.502555 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" 
event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerDied","Data":"e0315d8cc0acd737c3a716b79e4694046e9f35af282f50d84fae5d08baa4f019"} Mar 17 03:32:58 crc kubenswrapper[4735]: I0317 03:32:58.520535 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerStarted","Data":"b2ece01e3e174a3c569ca365ae0ef4c40f8b17641e4065c5ae727e1934ef4e95"} Mar 17 03:32:58 crc kubenswrapper[4735]: I0317 03:32:58.562251 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kntpk" podStartSLOduration=3.062811637 podStartE2EDuration="9.562223209s" podCreationTimestamp="2026-03-17 03:32:49 +0000 UTC" firstStartedPulling="2026-03-17 03:32:51.436588758 +0000 UTC m=+8597.068821736" lastFinishedPulling="2026-03-17 03:32:57.93600033 +0000 UTC m=+8603.568233308" observedRunningTime="2026-03-17 03:32:58.544818428 +0000 UTC m=+8604.177051406" watchObservedRunningTime="2026-03-17 03:32:58.562223209 +0000 UTC m=+8604.194456217" Mar 17 03:33:00 crc kubenswrapper[4735]: I0317 03:33:00.277152 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:33:00 crc kubenswrapper[4735]: I0317 03:33:00.277564 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:33:01 crc kubenswrapper[4735]: I0317 03:33:01.339457 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kntpk" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" probeResult="failure" output=< Mar 17 03:33:01 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:33:01 crc kubenswrapper[4735]: > Mar 17 03:33:11 crc kubenswrapper[4735]: I0317 03:33:11.320643 4735 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-kntpk" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" probeResult="failure" output=< Mar 17 03:33:11 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:33:11 crc kubenswrapper[4735]: > Mar 17 03:33:12 crc kubenswrapper[4735]: I0317 03:33:12.606222 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:33:12 crc kubenswrapper[4735]: I0317 03:33:12.606297 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:33:12 crc kubenswrapper[4735]: I0317 03:33:12.606366 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:33:12 crc kubenswrapper[4735]: I0317 03:33:12.607380 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:33:12 crc kubenswrapper[4735]: I0317 03:33:12.607457 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" 
containerID="cri-o://391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" gracePeriod=600 Mar 17 03:33:12 crc kubenswrapper[4735]: E0317 03:33:12.750038 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:33:13 crc kubenswrapper[4735]: I0317 03:33:13.677296 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" exitCode=0 Mar 17 03:33:13 crc kubenswrapper[4735]: I0317 03:33:13.677335 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5"} Mar 17 03:33:13 crc kubenswrapper[4735]: I0317 03:33:13.677367 4735 scope.go:117] "RemoveContainer" containerID="ce16d7d9e3fd849b9c8c1200e28d72fd41d4e80ffef08977f79639cb676e13b2" Mar 17 03:33:13 crc kubenswrapper[4735]: I0317 03:33:13.677787 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:33:13 crc kubenswrapper[4735]: E0317 03:33:13.678118 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:33:21 crc kubenswrapper[4735]: I0317 03:33:21.367926 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kntpk" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" probeResult="failure" output=< Mar 17 03:33:21 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:33:21 crc kubenswrapper[4735]: > Mar 17 03:33:25 crc kubenswrapper[4735]: I0317 03:33:25.081023 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:33:25 crc kubenswrapper[4735]: E0317 03:33:25.081621 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:33:31 crc kubenswrapper[4735]: I0317 03:33:31.326813 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kntpk" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" probeResult="failure" output=< Mar 17 03:33:31 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:33:31 crc kubenswrapper[4735]: > Mar 17 03:33:37 crc kubenswrapper[4735]: I0317 03:33:37.073004 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:33:37 crc kubenswrapper[4735]: E0317 03:33:37.073853 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:33:40 crc kubenswrapper[4735]: I0317 03:33:40.338016 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:33:40 crc kubenswrapper[4735]: I0317 03:33:40.422538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:33:40 crc kubenswrapper[4735]: I0317 03:33:40.592917 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kntpk"] Mar 17 03:33:41 crc kubenswrapper[4735]: I0317 03:33:41.962637 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kntpk" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" containerID="cri-o://b2ece01e3e174a3c569ca365ae0ef4c40f8b17641e4065c5ae727e1934ef4e95" gracePeriod=2 Mar 17 03:33:42 crc kubenswrapper[4735]: I0317 03:33:42.973521 4735 generic.go:334] "Generic (PLEG): container finished" podID="675cb00a-27d5-43e5-aae9-a919357c716a" containerID="b2ece01e3e174a3c569ca365ae0ef4c40f8b17641e4065c5ae727e1934ef4e95" exitCode=0 Mar 17 03:33:42 crc kubenswrapper[4735]: I0317 03:33:42.973576 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerDied","Data":"b2ece01e3e174a3c569ca365ae0ef4c40f8b17641e4065c5ae727e1934ef4e95"} Mar 17 03:33:42 crc kubenswrapper[4735]: I0317 03:33:42.974330 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kntpk" 
event={"ID":"675cb00a-27d5-43e5-aae9-a919357c716a","Type":"ContainerDied","Data":"c29f61ac2e6fa043b7a761f76bf587ac625ac8b21fed73dc34095d9141e5a7fd"} Mar 17 03:33:42 crc kubenswrapper[4735]: I0317 03:33:42.974359 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29f61ac2e6fa043b7a761f76bf587ac625ac8b21fed73dc34095d9141e5a7fd" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.009109 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.100229 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-catalog-content\") pod \"675cb00a-27d5-43e5-aae9-a919357c716a\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.100277 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-utilities\") pod \"675cb00a-27d5-43e5-aae9-a919357c716a\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.100350 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5v8\" (UniqueName: \"kubernetes.io/projected/675cb00a-27d5-43e5-aae9-a919357c716a-kube-api-access-mx5v8\") pod \"675cb00a-27d5-43e5-aae9-a919357c716a\" (UID: \"675cb00a-27d5-43e5-aae9-a919357c716a\") " Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.104938 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-utilities" (OuterVolumeSpecName: "utilities") pod "675cb00a-27d5-43e5-aae9-a919357c716a" (UID: "675cb00a-27d5-43e5-aae9-a919357c716a"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.115332 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675cb00a-27d5-43e5-aae9-a919357c716a-kube-api-access-mx5v8" (OuterVolumeSpecName: "kube-api-access-mx5v8") pod "675cb00a-27d5-43e5-aae9-a919357c716a" (UID: "675cb00a-27d5-43e5-aae9-a919357c716a"). InnerVolumeSpecName "kube-api-access-mx5v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.203997 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.204034 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5v8\" (UniqueName: \"kubernetes.io/projected/675cb00a-27d5-43e5-aae9-a919357c716a-kube-api-access-mx5v8\") on node \"crc\" DevicePath \"\"" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.224812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "675cb00a-27d5-43e5-aae9-a919357c716a" (UID: "675cb00a-27d5-43e5-aae9-a919357c716a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.306781 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675cb00a-27d5-43e5-aae9-a919357c716a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:33:43 crc kubenswrapper[4735]: I0317 03:33:43.982593 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kntpk" Mar 17 03:33:44 crc kubenswrapper[4735]: I0317 03:33:44.025981 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kntpk"] Mar 17 03:33:44 crc kubenswrapper[4735]: I0317 03:33:44.034972 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kntpk"] Mar 17 03:33:45 crc kubenswrapper[4735]: I0317 03:33:45.091903 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" path="/var/lib/kubelet/pods/675cb00a-27d5-43e5-aae9-a919357c716a/volumes" Mar 17 03:33:51 crc kubenswrapper[4735]: I0317 03:33:51.076321 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:33:51 crc kubenswrapper[4735]: E0317 03:33:51.077081 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.150897 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561974-b2d8b"] Mar 17 03:34:00 crc kubenswrapper[4735]: E0317 03:34:00.151617 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.151629 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" Mar 17 03:34:00 crc kubenswrapper[4735]: E0317 03:34:00.151653 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="extract-utilities" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.151661 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="extract-utilities" Mar 17 03:34:00 crc kubenswrapper[4735]: E0317 03:34:00.151689 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="extract-content" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.151694 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="extract-content" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.151891 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="675cb00a-27d5-43e5-aae9-a919357c716a" containerName="registry-server" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.152498 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.155289 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.156764 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.157279 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.173625 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561974-b2d8b"] Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.287462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj7rl\" (UniqueName: \"kubernetes.io/projected/f425a620-cd23-4ee5-bb24-f192fe724b5d-kube-api-access-bj7rl\") pod \"auto-csr-approver-29561974-b2d8b\" (UID: \"f425a620-cd23-4ee5-bb24-f192fe724b5d\") " pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.389965 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj7rl\" (UniqueName: \"kubernetes.io/projected/f425a620-cd23-4ee5-bb24-f192fe724b5d-kube-api-access-bj7rl\") pod \"auto-csr-approver-29561974-b2d8b\" (UID: \"f425a620-cd23-4ee5-bb24-f192fe724b5d\") " pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.427332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj7rl\" (UniqueName: \"kubernetes.io/projected/f425a620-cd23-4ee5-bb24-f192fe724b5d-kube-api-access-bj7rl\") pod \"auto-csr-approver-29561974-b2d8b\" (UID: \"f425a620-cd23-4ee5-bb24-f192fe724b5d\") " 
pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:00 crc kubenswrapper[4735]: I0317 03:34:00.476478 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:01 crc kubenswrapper[4735]: I0317 03:34:01.029456 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561974-b2d8b"] Mar 17 03:34:01 crc kubenswrapper[4735]: I0317 03:34:01.180160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" event={"ID":"f425a620-cd23-4ee5-bb24-f192fe724b5d","Type":"ContainerStarted","Data":"e62ba1802796d878c4b0b0414a3a9c2bfeb8fc56cc88c082904ca9f8335c90cc"} Mar 17 03:34:03 crc kubenswrapper[4735]: I0317 03:34:03.074988 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:34:03 crc kubenswrapper[4735]: E0317 03:34:03.075686 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:34:03 crc kubenswrapper[4735]: I0317 03:34:03.199646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" event={"ID":"f425a620-cd23-4ee5-bb24-f192fe724b5d","Type":"ContainerStarted","Data":"e41ef9bd9470039d831ebf409793d9082b4ac9b7c543ad8c784fee9fb8e241eb"} Mar 17 03:34:03 crc kubenswrapper[4735]: I0317 03:34:03.223185 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" podStartSLOduration=2.165729725 
podStartE2EDuration="3.223154578s" podCreationTimestamp="2026-03-17 03:34:00 +0000 UTC" firstStartedPulling="2026-03-17 03:34:01.033755719 +0000 UTC m=+8666.665988707" lastFinishedPulling="2026-03-17 03:34:02.091180572 +0000 UTC m=+8667.723413560" observedRunningTime="2026-03-17 03:34:03.213194148 +0000 UTC m=+8668.845427136" watchObservedRunningTime="2026-03-17 03:34:03.223154578 +0000 UTC m=+8668.855387596" Mar 17 03:34:04 crc kubenswrapper[4735]: I0317 03:34:04.211244 4735 generic.go:334] "Generic (PLEG): container finished" podID="f425a620-cd23-4ee5-bb24-f192fe724b5d" containerID="e41ef9bd9470039d831ebf409793d9082b4ac9b7c543ad8c784fee9fb8e241eb" exitCode=0 Mar 17 03:34:04 crc kubenswrapper[4735]: I0317 03:34:04.211288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" event={"ID":"f425a620-cd23-4ee5-bb24-f192fe724b5d","Type":"ContainerDied","Data":"e41ef9bd9470039d831ebf409793d9082b4ac9b7c543ad8c784fee9fb8e241eb"} Mar 17 03:34:05 crc kubenswrapper[4735]: I0317 03:34:05.684774 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:05 crc kubenswrapper[4735]: I0317 03:34:05.824072 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj7rl\" (UniqueName: \"kubernetes.io/projected/f425a620-cd23-4ee5-bb24-f192fe724b5d-kube-api-access-bj7rl\") pod \"f425a620-cd23-4ee5-bb24-f192fe724b5d\" (UID: \"f425a620-cd23-4ee5-bb24-f192fe724b5d\") " Mar 17 03:34:05 crc kubenswrapper[4735]: I0317 03:34:05.830093 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f425a620-cd23-4ee5-bb24-f192fe724b5d-kube-api-access-bj7rl" (OuterVolumeSpecName: "kube-api-access-bj7rl") pod "f425a620-cd23-4ee5-bb24-f192fe724b5d" (UID: "f425a620-cd23-4ee5-bb24-f192fe724b5d"). InnerVolumeSpecName "kube-api-access-bj7rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:34:05 crc kubenswrapper[4735]: I0317 03:34:05.926410 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj7rl\" (UniqueName: \"kubernetes.io/projected/f425a620-cd23-4ee5-bb24-f192fe724b5d-kube-api-access-bj7rl\") on node \"crc\" DevicePath \"\"" Mar 17 03:34:06 crc kubenswrapper[4735]: I0317 03:34:06.227482 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" event={"ID":"f425a620-cd23-4ee5-bb24-f192fe724b5d","Type":"ContainerDied","Data":"e62ba1802796d878c4b0b0414a3a9c2bfeb8fc56cc88c082904ca9f8335c90cc"} Mar 17 03:34:06 crc kubenswrapper[4735]: I0317 03:34:06.227513 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561974-b2d8b" Mar 17 03:34:06 crc kubenswrapper[4735]: I0317 03:34:06.227524 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e62ba1802796d878c4b0b0414a3a9c2bfeb8fc56cc88c082904ca9f8335c90cc" Mar 17 03:34:06 crc kubenswrapper[4735]: I0317 03:34:06.316355 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561968-8ntm9"] Mar 17 03:34:06 crc kubenswrapper[4735]: I0317 03:34:06.327478 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561968-8ntm9"] Mar 17 03:34:07 crc kubenswrapper[4735]: I0317 03:34:07.086549 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faba8fe-c48d-44fe-9b5d-dca3efc4a0be" path="/var/lib/kubelet/pods/2faba8fe-c48d-44fe-9b5d-dca3efc4a0be/volumes" Mar 17 03:34:07 crc kubenswrapper[4735]: I0317 03:34:07.542803 4735 scope.go:117] "RemoveContainer" containerID="145c0aa1e958ddd92bc7b9f6e7588c32e378eda7d1371db99d40b2378826b2c4" Mar 17 03:34:15 crc kubenswrapper[4735]: I0317 03:34:15.085025 4735 scope.go:117] "RemoveContainer" 
containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:34:15 crc kubenswrapper[4735]: E0317 03:34:15.086360 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:34:29 crc kubenswrapper[4735]: I0317 03:34:29.073828 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:34:29 crc kubenswrapper[4735]: E0317 03:34:29.074720 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.351455 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6sjh"] Mar 17 03:34:38 crc kubenswrapper[4735]: E0317 03:34:38.352354 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f425a620-cd23-4ee5-bb24-f192fe724b5d" containerName="oc" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.352367 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f425a620-cd23-4ee5-bb24-f192fe724b5d" containerName="oc" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.352566 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f425a620-cd23-4ee5-bb24-f192fe724b5d" containerName="oc" Mar 17 03:34:38 crc 
kubenswrapper[4735]: I0317 03:34:38.354104 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.382323 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6sjh"] Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.476138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-utilities\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.476372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-catalog-content\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.476433 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzx5\" (UniqueName: \"kubernetes.io/projected/1a796b70-bb0e-4989-8164-66544f58ae0f-kube-api-access-5xzx5\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.578631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-catalog-content\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 
03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.578692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzx5\" (UniqueName: \"kubernetes.io/projected/1a796b70-bb0e-4989-8164-66544f58ae0f-kube-api-access-5xzx5\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.578840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-utilities\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.579341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-catalog-content\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.579983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-utilities\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: I0317 03:34:38.606458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzx5\" (UniqueName: \"kubernetes.io/projected/1a796b70-bb0e-4989-8164-66544f58ae0f-kube-api-access-5xzx5\") pod \"community-operators-m6sjh\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:38 crc kubenswrapper[4735]: 
I0317 03:34:38.708790 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:39 crc kubenswrapper[4735]: I0317 03:34:39.200678 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6sjh"] Mar 17 03:34:39 crc kubenswrapper[4735]: I0317 03:34:39.649609 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerID="8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba" exitCode=0 Mar 17 03:34:39 crc kubenswrapper[4735]: I0317 03:34:39.649681 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerDied","Data":"8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba"} Mar 17 03:34:39 crc kubenswrapper[4735]: I0317 03:34:39.649748 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerStarted","Data":"023623c10768863f312515f99b014960a8bef9c802bd646738a30f8760145999"} Mar 17 03:34:40 crc kubenswrapper[4735]: I0317 03:34:40.660761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerStarted","Data":"c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17"} Mar 17 03:34:41 crc kubenswrapper[4735]: I0317 03:34:41.073206 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:34:41 crc kubenswrapper[4735]: E0317 03:34:41.073850 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:34:42 crc kubenswrapper[4735]: I0317 03:34:42.682568 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerID="c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17" exitCode=0 Mar 17 03:34:42 crc kubenswrapper[4735]: I0317 03:34:42.682616 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerDied","Data":"c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17"} Mar 17 03:34:43 crc kubenswrapper[4735]: I0317 03:34:43.694123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerStarted","Data":"ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6"} Mar 17 03:34:43 crc kubenswrapper[4735]: I0317 03:34:43.717119 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6sjh" podStartSLOduration=2.133212337 podStartE2EDuration="5.717100639s" podCreationTimestamp="2026-03-17 03:34:38 +0000 UTC" firstStartedPulling="2026-03-17 03:34:39.652438165 +0000 UTC m=+8705.284671143" lastFinishedPulling="2026-03-17 03:34:43.236326457 +0000 UTC m=+8708.868559445" observedRunningTime="2026-03-17 03:34:43.710622352 +0000 UTC m=+8709.342855340" watchObservedRunningTime="2026-03-17 03:34:43.717100639 +0000 UTC m=+8709.349333627" Mar 17 03:34:48 crc kubenswrapper[4735]: I0317 03:34:48.709804 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:48 
crc kubenswrapper[4735]: I0317 03:34:48.710579 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:48 crc kubenswrapper[4735]: I0317 03:34:48.775915 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:48 crc kubenswrapper[4735]: I0317 03:34:48.845730 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:49 crc kubenswrapper[4735]: I0317 03:34:49.025658 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6sjh"] Mar 17 03:34:50 crc kubenswrapper[4735]: I0317 03:34:50.752022 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6sjh" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="registry-server" containerID="cri-o://ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6" gracePeriod=2 Mar 17 03:34:50 crc kubenswrapper[4735]: E0317 03:34:50.976264 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a796b70_bb0e_4989_8164_66544f58ae0f.slice/crio-conmon-ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a796b70_bb0e_4989_8164_66544f58ae0f.slice/crio-ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6.scope\": RecentStats: unable to find data in memory cache]" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.293512 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.406790 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-catalog-content\") pod \"1a796b70-bb0e-4989-8164-66544f58ae0f\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.406986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzx5\" (UniqueName: \"kubernetes.io/projected/1a796b70-bb0e-4989-8164-66544f58ae0f-kube-api-access-5xzx5\") pod \"1a796b70-bb0e-4989-8164-66544f58ae0f\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.407152 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-utilities\") pod \"1a796b70-bb0e-4989-8164-66544f58ae0f\" (UID: \"1a796b70-bb0e-4989-8164-66544f58ae0f\") " Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.407932 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-utilities" (OuterVolumeSpecName: "utilities") pod "1a796b70-bb0e-4989-8164-66544f58ae0f" (UID: "1a796b70-bb0e-4989-8164-66544f58ae0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.419012 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a796b70-bb0e-4989-8164-66544f58ae0f-kube-api-access-5xzx5" (OuterVolumeSpecName: "kube-api-access-5xzx5") pod "1a796b70-bb0e-4989-8164-66544f58ae0f" (UID: "1a796b70-bb0e-4989-8164-66544f58ae0f"). InnerVolumeSpecName "kube-api-access-5xzx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.465995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a796b70-bb0e-4989-8164-66544f58ae0f" (UID: "1a796b70-bb0e-4989-8164-66544f58ae0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.510572 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzx5\" (UniqueName: \"kubernetes.io/projected/1a796b70-bb0e-4989-8164-66544f58ae0f-kube-api-access-5xzx5\") on node \"crc\" DevicePath \"\"" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.510624 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.510643 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a796b70-bb0e-4989-8164-66544f58ae0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.763162 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerID="ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6" exitCode=0 Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.763210 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerDied","Data":"ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6"} Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.763236 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-m6sjh" event={"ID":"1a796b70-bb0e-4989-8164-66544f58ae0f","Type":"ContainerDied","Data":"023623c10768863f312515f99b014960a8bef9c802bd646738a30f8760145999"} Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.763252 4735 scope.go:117] "RemoveContainer" containerID="ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.763302 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6sjh" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.806278 4735 scope.go:117] "RemoveContainer" containerID="c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.853353 4735 scope.go:117] "RemoveContainer" containerID="8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.855075 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6sjh"] Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.865182 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m6sjh"] Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.889300 4735 scope.go:117] "RemoveContainer" containerID="ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6" Mar 17 03:34:51 crc kubenswrapper[4735]: E0317 03:34:51.891545 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6\": container with ID starting with ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6 not found: ID does not exist" containerID="ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 
03:34:51.891665 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6"} err="failed to get container status \"ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6\": rpc error: code = NotFound desc = could not find container \"ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6\": container with ID starting with ac6d8a52e2b9f7ca091903b20de50c0e07783ec340873229776dc2fa71cc5cc6 not found: ID does not exist" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.891759 4735 scope.go:117] "RemoveContainer" containerID="c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17" Mar 17 03:34:51 crc kubenswrapper[4735]: E0317 03:34:51.892250 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17\": container with ID starting with c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17 not found: ID does not exist" containerID="c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.892299 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17"} err="failed to get container status \"c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17\": rpc error: code = NotFound desc = could not find container \"c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17\": container with ID starting with c401d634d8d2cb0aef05648f250f3fbb2ea6a7ef83b998018c8e9d43f0e4bc17 not found: ID does not exist" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.892325 4735 scope.go:117] "RemoveContainer" containerID="8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba" Mar 17 03:34:51 crc 
kubenswrapper[4735]: E0317 03:34:51.892613 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba\": container with ID starting with 8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba not found: ID does not exist" containerID="8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba" Mar 17 03:34:51 crc kubenswrapper[4735]: I0317 03:34:51.892636 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba"} err="failed to get container status \"8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba\": rpc error: code = NotFound desc = could not find container \"8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba\": container with ID starting with 8005de5a59803c76602dcc7c1911520fd9fbec53863af8ea477f79eb39c261ba not found: ID does not exist" Mar 17 03:34:53 crc kubenswrapper[4735]: I0317 03:34:53.080497 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:34:53 crc kubenswrapper[4735]: E0317 03:34:53.082217 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:34:53 crc kubenswrapper[4735]: I0317 03:34:53.085265 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" path="/var/lib/kubelet/pods/1a796b70-bb0e-4989-8164-66544f58ae0f/volumes" Mar 17 03:35:08 crc 
kubenswrapper[4735]: I0317 03:35:08.073848 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:35:08 crc kubenswrapper[4735]: E0317 03:35:08.074644 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:35:21 crc kubenswrapper[4735]: I0317 03:35:21.072884 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:35:21 crc kubenswrapper[4735]: E0317 03:35:21.074651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:35:33 crc kubenswrapper[4735]: I0317 03:35:33.073434 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:35:33 crc kubenswrapper[4735]: E0317 03:35:33.074655 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 
17 03:35:47 crc kubenswrapper[4735]: I0317 03:35:47.073795 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:35:47 crc kubenswrapper[4735]: E0317 03:35:47.074484 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:35:59 crc kubenswrapper[4735]: I0317 03:35:59.073380 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:35:59 crc kubenswrapper[4735]: E0317 03:35:59.074846 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.144645 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561976-f6s7n"] Mar 17 03:36:00 crc kubenswrapper[4735]: E0317 03:36:00.145107 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="extract-utilities" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.145123 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="extract-utilities" Mar 17 03:36:00 crc kubenswrapper[4735]: E0317 03:36:00.145171 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="extract-content" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.145180 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="extract-content" Mar 17 03:36:00 crc kubenswrapper[4735]: E0317 03:36:00.145206 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="registry-server" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.145214 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="registry-server" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.145411 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a796b70-bb0e-4989-8164-66544f58ae0f" containerName="registry-server" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.146124 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.149215 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.150002 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.150023 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.166133 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561976-f6s7n"] Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.214086 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxmw\" (UniqueName: \"kubernetes.io/projected/b41b877d-4fd6-46e3-bd90-801cbd9f5660-kube-api-access-zkxmw\") pod \"auto-csr-approver-29561976-f6s7n\" (UID: \"b41b877d-4fd6-46e3-bd90-801cbd9f5660\") " pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.316415 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxmw\" (UniqueName: \"kubernetes.io/projected/b41b877d-4fd6-46e3-bd90-801cbd9f5660-kube-api-access-zkxmw\") pod \"auto-csr-approver-29561976-f6s7n\" (UID: \"b41b877d-4fd6-46e3-bd90-801cbd9f5660\") " pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.337918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkxmw\" (UniqueName: \"kubernetes.io/projected/b41b877d-4fd6-46e3-bd90-801cbd9f5660-kube-api-access-zkxmw\") pod \"auto-csr-approver-29561976-f6s7n\" (UID: \"b41b877d-4fd6-46e3-bd90-801cbd9f5660\") " 
pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.470368 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:00 crc kubenswrapper[4735]: I0317 03:36:00.990151 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561976-f6s7n"] Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.556309 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4f6ww"] Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.559724 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.575868 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f6ww"] Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.602431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" event={"ID":"b41b877d-4fd6-46e3-bd90-801cbd9f5660","Type":"ContainerStarted","Data":"d1af35a2c6ff218b6c56845de552056ed7fe0f7395c364a86911e2fc9a6fdd70"} Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.650897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jntb\" (UniqueName: \"kubernetes.io/projected/265b4768-95df-435d-b6cd-e50bf1c9abcc-kube-api-access-7jntb\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.651593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-catalog-content\") pod 
\"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.651703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-utilities\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.754169 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-catalog-content\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.754250 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-utilities\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.754357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jntb\" (UniqueName: \"kubernetes.io/projected/265b4768-95df-435d-b6cd-e50bf1c9abcc-kube-api-access-7jntb\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.755018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-catalog-content\") pod 
\"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.755167 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-utilities\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.782053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jntb\" (UniqueName: \"kubernetes.io/projected/265b4768-95df-435d-b6cd-e50bf1c9abcc-kube-api-access-7jntb\") pod \"redhat-marketplace-4f6ww\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:01 crc kubenswrapper[4735]: I0317 03:36:01.888045 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:02 crc kubenswrapper[4735]: I0317 03:36:02.417024 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f6ww"] Mar 17 03:36:02 crc kubenswrapper[4735]: I0317 03:36:02.612743 4735 generic.go:334] "Generic (PLEG): container finished" podID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerID="53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6" exitCode=0 Mar 17 03:36:02 crc kubenswrapper[4735]: I0317 03:36:02.612823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerDied","Data":"53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6"} Mar 17 03:36:02 crc kubenswrapper[4735]: I0317 03:36:02.613158 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerStarted","Data":"7400729af36949e9d7aa306972247b5088bae051f15521f83c1e63e7a3ec81af"} Mar 17 03:36:02 crc kubenswrapper[4735]: I0317 03:36:02.615435 4735 generic.go:334] "Generic (PLEG): container finished" podID="b41b877d-4fd6-46e3-bd90-801cbd9f5660" containerID="545c20ca69af9b5255e18d61d942a61b8d9cbc96cc34e9975824abeec06f4856" exitCode=0 Mar 17 03:36:02 crc kubenswrapper[4735]: I0317 03:36:02.615460 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" event={"ID":"b41b877d-4fd6-46e3-bd90-801cbd9f5660","Type":"ContainerDied","Data":"545c20ca69af9b5255e18d61d942a61b8d9cbc96cc34e9975824abeec06f4856"} Mar 17 03:36:02 crc kubenswrapper[4735]: E0317 03:36:02.809784 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41b877d_4fd6_46e3_bd90_801cbd9f5660.slice/crio-conmon-545c20ca69af9b5255e18d61d942a61b8d9cbc96cc34e9975824abeec06f4856.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41b877d_4fd6_46e3_bd90_801cbd9f5660.slice/crio-545c20ca69af9b5255e18d61d942a61b8d9cbc96cc34e9975824abeec06f4856.scope\": RecentStats: unable to find data in memory cache]" Mar 17 03:36:03 crc kubenswrapper[4735]: I0317 03:36:03.628348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerStarted","Data":"ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d"} Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.224501 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.315037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkxmw\" (UniqueName: \"kubernetes.io/projected/b41b877d-4fd6-46e3-bd90-801cbd9f5660-kube-api-access-zkxmw\") pod \"b41b877d-4fd6-46e3-bd90-801cbd9f5660\" (UID: \"b41b877d-4fd6-46e3-bd90-801cbd9f5660\") " Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.331054 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41b877d-4fd6-46e3-bd90-801cbd9f5660-kube-api-access-zkxmw" (OuterVolumeSpecName: "kube-api-access-zkxmw") pod "b41b877d-4fd6-46e3-bd90-801cbd9f5660" (UID: "b41b877d-4fd6-46e3-bd90-801cbd9f5660"). InnerVolumeSpecName "kube-api-access-zkxmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.417735 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkxmw\" (UniqueName: \"kubernetes.io/projected/b41b877d-4fd6-46e3-bd90-801cbd9f5660-kube-api-access-zkxmw\") on node \"crc\" DevicePath \"\"" Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.638534 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.638533 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561976-f6s7n" event={"ID":"b41b877d-4fd6-46e3-bd90-801cbd9f5660","Type":"ContainerDied","Data":"d1af35a2c6ff218b6c56845de552056ed7fe0f7395c364a86911e2fc9a6fdd70"} Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.639708 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1af35a2c6ff218b6c56845de552056ed7fe0f7395c364a86911e2fc9a6fdd70" Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.641767 4735 generic.go:334] "Generic (PLEG): container finished" podID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerID="ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d" exitCode=0 Mar 17 03:36:04 crc kubenswrapper[4735]: I0317 03:36:04.641802 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerDied","Data":"ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d"} Mar 17 03:36:05 crc kubenswrapper[4735]: I0317 03:36:05.293714 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561970-4pwmq"] Mar 17 03:36:05 crc kubenswrapper[4735]: I0317 03:36:05.302694 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29561970-4pwmq"] Mar 17 03:36:05 crc kubenswrapper[4735]: I0317 03:36:05.654204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerStarted","Data":"7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694"} Mar 17 03:36:05 crc kubenswrapper[4735]: I0317 03:36:05.676836 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4f6ww" podStartSLOduration=2.192092472 podStartE2EDuration="4.67681731s" podCreationTimestamp="2026-03-17 03:36:01 +0000 UTC" firstStartedPulling="2026-03-17 03:36:02.616050005 +0000 UTC m=+8788.248282983" lastFinishedPulling="2026-03-17 03:36:05.100774843 +0000 UTC m=+8790.733007821" observedRunningTime="2026-03-17 03:36:05.672007863 +0000 UTC m=+8791.304240841" watchObservedRunningTime="2026-03-17 03:36:05.67681731 +0000 UTC m=+8791.309050288" Mar 17 03:36:07 crc kubenswrapper[4735]: I0317 03:36:07.083699 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2841700d-a5ad-421e-8715-2071bae5b5e5" path="/var/lib/kubelet/pods/2841700d-a5ad-421e-8715-2071bae5b5e5/volumes" Mar 17 03:36:07 crc kubenswrapper[4735]: I0317 03:36:07.660186 4735 scope.go:117] "RemoveContainer" containerID="77734702bd2a30c2d66cbe3aa90a401bd22c8e6f230c29b6f993a54f87092f4b" Mar 17 03:36:11 crc kubenswrapper[4735]: I0317 03:36:11.075162 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:36:11 crc kubenswrapper[4735]: E0317 03:36:11.075786 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:36:11 crc kubenswrapper[4735]: I0317 03:36:11.888837 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:11 crc kubenswrapper[4735]: I0317 03:36:11.888933 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:11 crc kubenswrapper[4735]: I0317 03:36:11.938740 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:12 crc kubenswrapper[4735]: I0317 03:36:12.793416 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:12 crc kubenswrapper[4735]: I0317 03:36:12.862420 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f6ww"] Mar 17 03:36:14 crc kubenswrapper[4735]: I0317 03:36:14.752035 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4f6ww" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="registry-server" containerID="cri-o://7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694" gracePeriod=2 Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.292980 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.430897 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jntb\" (UniqueName: \"kubernetes.io/projected/265b4768-95df-435d-b6cd-e50bf1c9abcc-kube-api-access-7jntb\") pod \"265b4768-95df-435d-b6cd-e50bf1c9abcc\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.431251 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-utilities\") pod \"265b4768-95df-435d-b6cd-e50bf1c9abcc\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.432081 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-utilities" (OuterVolumeSpecName: "utilities") pod "265b4768-95df-435d-b6cd-e50bf1c9abcc" (UID: "265b4768-95df-435d-b6cd-e50bf1c9abcc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.432417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-catalog-content\") pod \"265b4768-95df-435d-b6cd-e50bf1c9abcc\" (UID: \"265b4768-95df-435d-b6cd-e50bf1c9abcc\") " Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.434902 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.447049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265b4768-95df-435d-b6cd-e50bf1c9abcc-kube-api-access-7jntb" (OuterVolumeSpecName: "kube-api-access-7jntb") pod "265b4768-95df-435d-b6cd-e50bf1c9abcc" (UID: "265b4768-95df-435d-b6cd-e50bf1c9abcc"). InnerVolumeSpecName "kube-api-access-7jntb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.457156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "265b4768-95df-435d-b6cd-e50bf1c9abcc" (UID: "265b4768-95df-435d-b6cd-e50bf1c9abcc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.537012 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265b4768-95df-435d-b6cd-e50bf1c9abcc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.537308 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jntb\" (UniqueName: \"kubernetes.io/projected/265b4768-95df-435d-b6cd-e50bf1c9abcc-kube-api-access-7jntb\") on node \"crc\" DevicePath \"\"" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.766506 4735 generic.go:334] "Generic (PLEG): container finished" podID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerID="7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694" exitCode=0 Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.766614 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerDied","Data":"7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694"} Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.766700 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f6ww" event={"ID":"265b4768-95df-435d-b6cd-e50bf1c9abcc","Type":"ContainerDied","Data":"7400729af36949e9d7aa306972247b5088bae051f15521f83c1e63e7a3ec81af"} Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.766737 4735 scope.go:117] "RemoveContainer" containerID="7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.767416 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f6ww" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.803598 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f6ww"] Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.811796 4735 scope.go:117] "RemoveContainer" containerID="ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.815609 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f6ww"] Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.840419 4735 scope.go:117] "RemoveContainer" containerID="53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.896083 4735 scope.go:117] "RemoveContainer" containerID="7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694" Mar 17 03:36:15 crc kubenswrapper[4735]: E0317 03:36:15.896683 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694\": container with ID starting with 7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694 not found: ID does not exist" containerID="7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.896757 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694"} err="failed to get container status \"7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694\": rpc error: code = NotFound desc = could not find container \"7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694\": container with ID starting with 7046765b34ef03d30e3b53c0dad49aaa7b2425aeb854010fab1b50b21a736694 not found: 
ID does not exist" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.896791 4735 scope.go:117] "RemoveContainer" containerID="ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d" Mar 17 03:36:15 crc kubenswrapper[4735]: E0317 03:36:15.897320 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d\": container with ID starting with ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d not found: ID does not exist" containerID="ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.897361 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d"} err="failed to get container status \"ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d\": rpc error: code = NotFound desc = could not find container \"ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d\": container with ID starting with ce1f070255d57355a57ac380ec9a883f620c40400804d268d5aeee935e31547d not found: ID does not exist" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.897392 4735 scope.go:117] "RemoveContainer" containerID="53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6" Mar 17 03:36:15 crc kubenswrapper[4735]: E0317 03:36:15.897953 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6\": container with ID starting with 53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6 not found: ID does not exist" containerID="53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6" Mar 17 03:36:15 crc kubenswrapper[4735]: I0317 03:36:15.898009 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6"} err="failed to get container status \"53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6\": rpc error: code = NotFound desc = could not find container \"53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6\": container with ID starting with 53c6383943c382936de2488aea724172c6eb77533effcd01182584c43b1e3df6 not found: ID does not exist" Mar 17 03:36:17 crc kubenswrapper[4735]: I0317 03:36:17.082447 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" path="/var/lib/kubelet/pods/265b4768-95df-435d-b6cd-e50bf1c9abcc/volumes" Mar 17 03:36:25 crc kubenswrapper[4735]: I0317 03:36:25.080979 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:36:25 crc kubenswrapper[4735]: E0317 03:36:25.082187 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:36:38 crc kubenswrapper[4735]: I0317 03:36:38.072788 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:36:38 crc kubenswrapper[4735]: E0317 03:36:38.073671 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:36:52 crc kubenswrapper[4735]: I0317 03:36:52.073677 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:36:52 crc kubenswrapper[4735]: E0317 03:36:52.074552 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.151325 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dprzp"] Mar 17 03:37:01 crc kubenswrapper[4735]: E0317 03:37:01.152688 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="extract-content" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.152711 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="extract-content" Mar 17 03:37:01 crc kubenswrapper[4735]: E0317 03:37:01.152741 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="registry-server" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.152757 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="registry-server" Mar 17 03:37:01 crc kubenswrapper[4735]: E0317 03:37:01.152809 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41b877d-4fd6-46e3-bd90-801cbd9f5660" containerName="oc" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 
03:37:01.152824 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41b877d-4fd6-46e3-bd90-801cbd9f5660" containerName="oc" Mar 17 03:37:01 crc kubenswrapper[4735]: E0317 03:37:01.152854 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="extract-utilities" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.152894 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="extract-utilities" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.153287 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41b877d-4fd6-46e3-bd90-801cbd9f5660" containerName="oc" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.153311 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="265b4768-95df-435d-b6cd-e50bf1c9abcc" containerName="registry-server" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.155234 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.167819 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dprzp"] Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.242151 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-catalog-content\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.242228 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9flgq\" (UniqueName: \"kubernetes.io/projected/11d950b9-2154-4ab5-969d-a757bba6ba91-kube-api-access-9flgq\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.242265 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-utilities\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.344084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-catalog-content\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.344170 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9flgq\" (UniqueName: \"kubernetes.io/projected/11d950b9-2154-4ab5-969d-a757bba6ba91-kube-api-access-9flgq\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.344211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-utilities\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.344622 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-catalog-content\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.344750 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-utilities\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.361983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9flgq\" (UniqueName: \"kubernetes.io/projected/11d950b9-2154-4ab5-969d-a757bba6ba91-kube-api-access-9flgq\") pod \"certified-operators-dprzp\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:01 crc kubenswrapper[4735]: I0317 03:37:01.518810 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:02 crc kubenswrapper[4735]: I0317 03:37:02.023677 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dprzp"] Mar 17 03:37:02 crc kubenswrapper[4735]: W0317 03:37:02.030266 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d950b9_2154_4ab5_969d_a757bba6ba91.slice/crio-0cbcb2594bd5a877083b0dbd6d2907ff39711b59a039d541b0d07e664dfc45dd WatchSource:0}: Error finding container 0cbcb2594bd5a877083b0dbd6d2907ff39711b59a039d541b0d07e664dfc45dd: Status 404 returned error can't find the container with id 0cbcb2594bd5a877083b0dbd6d2907ff39711b59a039d541b0d07e664dfc45dd Mar 17 03:37:02 crc kubenswrapper[4735]: I0317 03:37:02.235136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerStarted","Data":"d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f"} Mar 17 03:37:02 crc kubenswrapper[4735]: I0317 03:37:02.235174 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerStarted","Data":"0cbcb2594bd5a877083b0dbd6d2907ff39711b59a039d541b0d07e664dfc45dd"} Mar 17 03:37:03 crc kubenswrapper[4735]: I0317 03:37:03.247101 4735 generic.go:334] "Generic (PLEG): container finished" podID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerID="d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f" exitCode=0 Mar 17 03:37:03 crc kubenswrapper[4735]: I0317 03:37:03.247159 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dprzp" 
event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerDied","Data":"d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f"} Mar 17 03:37:03 crc kubenswrapper[4735]: I0317 03:37:03.251619 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:37:04 crc kubenswrapper[4735]: I0317 03:37:04.258561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerStarted","Data":"2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca"} Mar 17 03:37:06 crc kubenswrapper[4735]: I0317 03:37:06.281791 4735 generic.go:334] "Generic (PLEG): container finished" podID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerID="2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca" exitCode=0 Mar 17 03:37:06 crc kubenswrapper[4735]: I0317 03:37:06.283577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerDied","Data":"2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca"} Mar 17 03:37:07 crc kubenswrapper[4735]: I0317 03:37:07.073440 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:37:07 crc kubenswrapper[4735]: E0317 03:37:07.074049 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:37:07 crc kubenswrapper[4735]: I0317 03:37:07.299223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerStarted","Data":"15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219"} Mar 17 03:37:07 crc kubenswrapper[4735]: I0317 03:37:07.351022 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dprzp" podStartSLOduration=2.877902399 podStartE2EDuration="6.350997642s" podCreationTimestamp="2026-03-17 03:37:01 +0000 UTC" firstStartedPulling="2026-03-17 03:37:03.249329703 +0000 UTC m=+8848.881562691" lastFinishedPulling="2026-03-17 03:37:06.722424946 +0000 UTC m=+8852.354657934" observedRunningTime="2026-03-17 03:37:07.337125696 +0000 UTC m=+8852.969358714" watchObservedRunningTime="2026-03-17 03:37:07.350997642 +0000 UTC m=+8852.983230660" Mar 17 03:37:11 crc kubenswrapper[4735]: I0317 03:37:11.519316 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:11 crc kubenswrapper[4735]: I0317 03:37:11.519960 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:12 crc kubenswrapper[4735]: I0317 03:37:12.593610 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dprzp" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="registry-server" probeResult="failure" output=< Mar 17 03:37:12 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:37:12 crc kubenswrapper[4735]: > Mar 17 03:37:19 crc kubenswrapper[4735]: I0317 03:37:19.075117 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:37:19 crc kubenswrapper[4735]: E0317 03:37:19.076189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:37:21 crc kubenswrapper[4735]: I0317 03:37:21.582292 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:21 crc kubenswrapper[4735]: I0317 03:37:21.643555 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:21 crc kubenswrapper[4735]: I0317 03:37:21.827978 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dprzp"] Mar 17 03:37:23 crc kubenswrapper[4735]: I0317 03:37:23.452713 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dprzp" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="registry-server" containerID="cri-o://15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219" gracePeriod=2 Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.054645 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.169037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-catalog-content\") pod \"11d950b9-2154-4ab5-969d-a757bba6ba91\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.169110 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9flgq\" (UniqueName: \"kubernetes.io/projected/11d950b9-2154-4ab5-969d-a757bba6ba91-kube-api-access-9flgq\") pod \"11d950b9-2154-4ab5-969d-a757bba6ba91\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.169154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-utilities\") pod \"11d950b9-2154-4ab5-969d-a757bba6ba91\" (UID: \"11d950b9-2154-4ab5-969d-a757bba6ba91\") " Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.171839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-utilities" (OuterVolumeSpecName: "utilities") pod "11d950b9-2154-4ab5-969d-a757bba6ba91" (UID: "11d950b9-2154-4ab5-969d-a757bba6ba91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.180722 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d950b9-2154-4ab5-969d-a757bba6ba91-kube-api-access-9flgq" (OuterVolumeSpecName: "kube-api-access-9flgq") pod "11d950b9-2154-4ab5-969d-a757bba6ba91" (UID: "11d950b9-2154-4ab5-969d-a757bba6ba91"). InnerVolumeSpecName "kube-api-access-9flgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.241041 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11d950b9-2154-4ab5-969d-a757bba6ba91" (UID: "11d950b9-2154-4ab5-969d-a757bba6ba91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.272215 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.272270 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9flgq\" (UniqueName: \"kubernetes.io/projected/11d950b9-2154-4ab5-969d-a757bba6ba91-kube-api-access-9flgq\") on node \"crc\" DevicePath \"\"" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.272290 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d950b9-2154-4ab5-969d-a757bba6ba91-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.466404 4735 generic.go:334] "Generic (PLEG): container finished" podID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerID="15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219" exitCode=0 Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.466443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerDied","Data":"15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219"} Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.466468 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dprzp" event={"ID":"11d950b9-2154-4ab5-969d-a757bba6ba91","Type":"ContainerDied","Data":"0cbcb2594bd5a877083b0dbd6d2907ff39711b59a039d541b0d07e664dfc45dd"} Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.466486 4735 scope.go:117] "RemoveContainer" containerID="15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.466621 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dprzp" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.505066 4735 scope.go:117] "RemoveContainer" containerID="2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.517132 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dprzp"] Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.528674 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dprzp"] Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.534112 4735 scope.go:117] "RemoveContainer" containerID="d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.583886 4735 scope.go:117] "RemoveContainer" containerID="15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219" Mar 17 03:37:24 crc kubenswrapper[4735]: E0317 03:37:24.584560 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219\": container with ID starting with 15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219 not found: ID does not exist" containerID="15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 
03:37:24.584593 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219"} err="failed to get container status \"15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219\": rpc error: code = NotFound desc = could not find container \"15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219\": container with ID starting with 15c38a6990ca81793c6a61327e4b5ecef6d778494da14c9f3b564e70330db219 not found: ID does not exist" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.584614 4735 scope.go:117] "RemoveContainer" containerID="2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca" Mar 17 03:37:24 crc kubenswrapper[4735]: E0317 03:37:24.584895 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca\": container with ID starting with 2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca not found: ID does not exist" containerID="2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.584926 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca"} err="failed to get container status \"2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca\": rpc error: code = NotFound desc = could not find container \"2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca\": container with ID starting with 2fc9433302ef05cfd978b59cee97752f7b5bec5001a66e5fd4bc4cdac351a4ca not found: ID does not exist" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.584945 4735 scope.go:117] "RemoveContainer" containerID="d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f" Mar 17 03:37:24 crc 
kubenswrapper[4735]: E0317 03:37:24.585193 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f\": container with ID starting with d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f not found: ID does not exist" containerID="d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f" Mar 17 03:37:24 crc kubenswrapper[4735]: I0317 03:37:24.585300 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f"} err="failed to get container status \"d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f\": rpc error: code = NotFound desc = could not find container \"d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f\": container with ID starting with d37942e623b13319eae0227c1fceca6cc3e081adde03b4f265b42ae014b6cb5f not found: ID does not exist" Mar 17 03:37:25 crc kubenswrapper[4735]: I0317 03:37:25.084668 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" path="/var/lib/kubelet/pods/11d950b9-2154-4ab5-969d-a757bba6ba91/volumes" Mar 17 03:37:33 crc kubenswrapper[4735]: I0317 03:37:33.073018 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:37:33 crc kubenswrapper[4735]: E0317 03:37:33.075071 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:37:44 crc 
kubenswrapper[4735]: I0317 03:37:44.073450 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:37:44 crc kubenswrapper[4735]: E0317 03:37:44.074537 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:37:58 crc kubenswrapper[4735]: I0317 03:37:58.073853 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:37:58 crc kubenswrapper[4735]: E0317 03:37:58.075201 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.154500 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561978-gr75q"] Mar 17 03:38:00 crc kubenswrapper[4735]: E0317 03:38:00.155341 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="extract-content" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.155358 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="extract-content" Mar 17 03:38:00 crc kubenswrapper[4735]: E0317 03:38:00.155376 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="extract-utilities" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.155383 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="extract-utilities" Mar 17 03:38:00 crc kubenswrapper[4735]: E0317 03:38:00.155412 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="registry-server" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.155420 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="registry-server" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.155683 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d950b9-2154-4ab5-969d-a757bba6ba91" containerName="registry-server" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.156480 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.160799 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.162552 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.162824 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.172422 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561978-gr75q"] Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.303528 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskpg\" (UniqueName: \"kubernetes.io/projected/9b80fdd5-0ad0-4d77-9eb0-1196019ca352-kube-api-access-gskpg\") pod \"auto-csr-approver-29561978-gr75q\" (UID: \"9b80fdd5-0ad0-4d77-9eb0-1196019ca352\") " pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.405635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskpg\" (UniqueName: \"kubernetes.io/projected/9b80fdd5-0ad0-4d77-9eb0-1196019ca352-kube-api-access-gskpg\") pod \"auto-csr-approver-29561978-gr75q\" (UID: \"9b80fdd5-0ad0-4d77-9eb0-1196019ca352\") " pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.432038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskpg\" (UniqueName: \"kubernetes.io/projected/9b80fdd5-0ad0-4d77-9eb0-1196019ca352-kube-api-access-gskpg\") pod \"auto-csr-approver-29561978-gr75q\" (UID: \"9b80fdd5-0ad0-4d77-9eb0-1196019ca352\") " 
pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.478288 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:00 crc kubenswrapper[4735]: I0317 03:38:00.982822 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561978-gr75q"] Mar 17 03:38:01 crc kubenswrapper[4735]: I0317 03:38:01.871150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561978-gr75q" event={"ID":"9b80fdd5-0ad0-4d77-9eb0-1196019ca352","Type":"ContainerStarted","Data":"5f02b9652a6dbdb57e1abcea9a2855ac3e281590c4c775c25431a00d31f22f86"} Mar 17 03:38:02 crc kubenswrapper[4735]: I0317 03:38:02.884076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561978-gr75q" event={"ID":"9b80fdd5-0ad0-4d77-9eb0-1196019ca352","Type":"ContainerStarted","Data":"b92d3082b49e865447277f99bdf275ebf1db17038dd2007e0f51dfd1408108b5"} Mar 17 03:38:02 crc kubenswrapper[4735]: I0317 03:38:02.899812 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561978-gr75q" podStartSLOduration=1.919961765 podStartE2EDuration="2.899794153s" podCreationTimestamp="2026-03-17 03:38:00 +0000 UTC" firstStartedPulling="2026-03-17 03:38:00.998271813 +0000 UTC m=+8906.630504791" lastFinishedPulling="2026-03-17 03:38:01.978104181 +0000 UTC m=+8907.610337179" observedRunningTime="2026-03-17 03:38:02.898036451 +0000 UTC m=+8908.530269439" watchObservedRunningTime="2026-03-17 03:38:02.899794153 +0000 UTC m=+8908.532027141" Mar 17 03:38:03 crc kubenswrapper[4735]: I0317 03:38:03.898884 4735 generic.go:334] "Generic (PLEG): container finished" podID="9b80fdd5-0ad0-4d77-9eb0-1196019ca352" containerID="b92d3082b49e865447277f99bdf275ebf1db17038dd2007e0f51dfd1408108b5" exitCode=0 Mar 17 03:38:03 crc 
kubenswrapper[4735]: I0317 03:38:03.899163 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561978-gr75q" event={"ID":"9b80fdd5-0ad0-4d77-9eb0-1196019ca352","Type":"ContainerDied","Data":"b92d3082b49e865447277f99bdf275ebf1db17038dd2007e0f51dfd1408108b5"} Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.245047 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.329570 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gskpg\" (UniqueName: \"kubernetes.io/projected/9b80fdd5-0ad0-4d77-9eb0-1196019ca352-kube-api-access-gskpg\") pod \"9b80fdd5-0ad0-4d77-9eb0-1196019ca352\" (UID: \"9b80fdd5-0ad0-4d77-9eb0-1196019ca352\") " Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.340288 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b80fdd5-0ad0-4d77-9eb0-1196019ca352-kube-api-access-gskpg" (OuterVolumeSpecName: "kube-api-access-gskpg") pod "9b80fdd5-0ad0-4d77-9eb0-1196019ca352" (UID: "9b80fdd5-0ad0-4d77-9eb0-1196019ca352"). InnerVolumeSpecName "kube-api-access-gskpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.431639 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gskpg\" (UniqueName: \"kubernetes.io/projected/9b80fdd5-0ad0-4d77-9eb0-1196019ca352-kube-api-access-gskpg\") on node \"crc\" DevicePath \"\"" Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.921412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561978-gr75q" event={"ID":"9b80fdd5-0ad0-4d77-9eb0-1196019ca352","Type":"ContainerDied","Data":"5f02b9652a6dbdb57e1abcea9a2855ac3e281590c4c775c25431a00d31f22f86"} Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.921768 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f02b9652a6dbdb57e1abcea9a2855ac3e281590c4c775c25431a00d31f22f86" Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.921825 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561978-gr75q" Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.983837 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561972-dn6f6"] Mar 17 03:38:05 crc kubenswrapper[4735]: I0317 03:38:05.992273 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561972-dn6f6"] Mar 17 03:38:07 crc kubenswrapper[4735]: I0317 03:38:07.086025 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462c2898-109d-4525-aa2e-660175f723a7" path="/var/lib/kubelet/pods/462c2898-109d-4525-aa2e-660175f723a7/volumes" Mar 17 03:38:07 crc kubenswrapper[4735]: I0317 03:38:07.794206 4735 scope.go:117] "RemoveContainer" containerID="84ac71c1b2c299d75773d2c5a8cf5f04dc72fd7f651a318263d3db194296ee7a" Mar 17 03:38:09 crc kubenswrapper[4735]: I0317 03:38:09.074268 4735 scope.go:117] "RemoveContainer" 
containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:38:09 crc kubenswrapper[4735]: E0317 03:38:09.074739 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:38:21 crc kubenswrapper[4735]: I0317 03:38:21.073551 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:38:22 crc kubenswrapper[4735]: I0317 03:38:22.091735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"2d744d5f50a3568187c8579fc0bcf095addb71076d6921ec818ecc64259e4172"} Mar 17 03:39:07 crc kubenswrapper[4735]: I0317 03:39:07.882424 4735 scope.go:117] "RemoveContainer" containerID="e0315d8cc0acd737c3a716b79e4694046e9f35af282f50d84fae5d08baa4f019" Mar 17 03:39:07 crc kubenswrapper[4735]: I0317 03:39:07.918908 4735 scope.go:117] "RemoveContainer" containerID="b2ece01e3e174a3c569ca365ae0ef4c40f8b17641e4065c5ae727e1934ef4e95" Mar 17 03:39:07 crc kubenswrapper[4735]: I0317 03:39:07.987716 4735 scope.go:117] "RemoveContainer" containerID="ce499c78bb1b55f5728934a95449c4dfe9d1b6969e8a7e5a991081415068ae1d" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.209510 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561980-7txfc"] Mar 17 03:40:00 crc kubenswrapper[4735]: E0317 03:40:00.210609 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b80fdd5-0ad0-4d77-9eb0-1196019ca352" containerName="oc" Mar 17 
03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.210622 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b80fdd5-0ad0-4d77-9eb0-1196019ca352" containerName="oc" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.210810 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b80fdd5-0ad0-4d77-9eb0-1196019ca352" containerName="oc" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.215053 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.219900 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.220630 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.220679 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.299179 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561980-7txfc"] Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.399881 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5bj\" (UniqueName: \"kubernetes.io/projected/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb-kube-api-access-xg5bj\") pod \"auto-csr-approver-29561980-7txfc\" (UID: \"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb\") " pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.502094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg5bj\" (UniqueName: \"kubernetes.io/projected/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb-kube-api-access-xg5bj\") pod 
\"auto-csr-approver-29561980-7txfc\" (UID: \"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb\") " pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.531203 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5bj\" (UniqueName: \"kubernetes.io/projected/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb-kube-api-access-xg5bj\") pod \"auto-csr-approver-29561980-7txfc\" (UID: \"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb\") " pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:00 crc kubenswrapper[4735]: I0317 03:40:00.830302 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:01 crc kubenswrapper[4735]: I0317 03:40:01.572257 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561980-7txfc"] Mar 17 03:40:01 crc kubenswrapper[4735]: I0317 03:40:01.763599 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561980-7txfc" event={"ID":"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb","Type":"ContainerStarted","Data":"ee3db9fed0ba4aca0cf81441cd0a86f21810080fd1dfc73b0998af545401328e"} Mar 17 03:40:04 crc kubenswrapper[4735]: I0317 03:40:04.801028 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561980-7txfc" event={"ID":"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb","Type":"ContainerStarted","Data":"1d47c23cb06bb74db809ddc2f2e761a4719c66cff0b40e1d3e297f799295d28b"} Mar 17 03:40:04 crc kubenswrapper[4735]: I0317 03:40:04.818491 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561980-7txfc" podStartSLOduration=2.7388206569999998 podStartE2EDuration="4.818469822s" podCreationTimestamp="2026-03-17 03:40:00 +0000 UTC" firstStartedPulling="2026-03-17 03:40:01.569482388 +0000 UTC m=+9027.201715376" lastFinishedPulling="2026-03-17 
03:40:03.649131543 +0000 UTC m=+9029.281364541" observedRunningTime="2026-03-17 03:40:04.812714673 +0000 UTC m=+9030.444947651" watchObservedRunningTime="2026-03-17 03:40:04.818469822 +0000 UTC m=+9030.450702800" Mar 17 03:40:05 crc kubenswrapper[4735]: I0317 03:40:05.817242 4735 generic.go:334] "Generic (PLEG): container finished" podID="ce0a40f8-f770-46e2-8c7a-13a0aa615eeb" containerID="1d47c23cb06bb74db809ddc2f2e761a4719c66cff0b40e1d3e297f799295d28b" exitCode=0 Mar 17 03:40:05 crc kubenswrapper[4735]: I0317 03:40:05.817503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561980-7txfc" event={"ID":"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb","Type":"ContainerDied","Data":"1d47c23cb06bb74db809ddc2f2e761a4719c66cff0b40e1d3e297f799295d28b"} Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.245281 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.336091 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg5bj\" (UniqueName: \"kubernetes.io/projected/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb-kube-api-access-xg5bj\") pod \"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb\" (UID: \"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb\") " Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.351689 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb-kube-api-access-xg5bj" (OuterVolumeSpecName: "kube-api-access-xg5bj") pod "ce0a40f8-f770-46e2-8c7a-13a0aa615eeb" (UID: "ce0a40f8-f770-46e2-8c7a-13a0aa615eeb"). InnerVolumeSpecName "kube-api-access-xg5bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.437953 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg5bj\" (UniqueName: \"kubernetes.io/projected/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb-kube-api-access-xg5bj\") on node \"crc\" DevicePath \"\"" Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.845629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561980-7txfc" event={"ID":"ce0a40f8-f770-46e2-8c7a-13a0aa615eeb","Type":"ContainerDied","Data":"ee3db9fed0ba4aca0cf81441cd0a86f21810080fd1dfc73b0998af545401328e"} Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.846323 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee3db9fed0ba4aca0cf81441cd0a86f21810080fd1dfc73b0998af545401328e" Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.845721 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561980-7txfc" Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.925143 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561974-b2d8b"] Mar 17 03:40:07 crc kubenswrapper[4735]: I0317 03:40:07.935430 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561974-b2d8b"] Mar 17 03:40:09 crc kubenswrapper[4735]: I0317 03:40:09.083730 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f425a620-cd23-4ee5-bb24-f192fe724b5d" path="/var/lib/kubelet/pods/f425a620-cd23-4ee5-bb24-f192fe724b5d/volumes" Mar 17 03:40:42 crc kubenswrapper[4735]: I0317 03:40:42.606169 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 03:40:42 crc kubenswrapper[4735]: I0317 03:40:42.608026 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:41:08 crc kubenswrapper[4735]: I0317 03:41:08.079042 4735 scope.go:117] "RemoveContainer" containerID="e41ef9bd9470039d831ebf409793d9082b4ac9b7c543ad8c784fee9fb8e241eb" Mar 17 03:41:12 crc kubenswrapper[4735]: I0317 03:41:12.606253 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:41:12 crc kubenswrapper[4735]: I0317 03:41:12.606792 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.606151 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.606926 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.607018 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.608279 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d744d5f50a3568187c8579fc0bcf095addb71076d6921ec818ecc64259e4172"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.608969 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://2d744d5f50a3568187c8579fc0bcf095addb71076d6921ec818ecc64259e4172" gracePeriod=600 Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.937783 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="2d744d5f50a3568187c8579fc0bcf095addb71076d6921ec818ecc64259e4172" exitCode=0 Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.937813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"2d744d5f50a3568187c8579fc0bcf095addb71076d6921ec818ecc64259e4172"} Mar 17 03:41:42 crc kubenswrapper[4735]: I0317 03:41:42.938272 4735 scope.go:117] "RemoveContainer" containerID="391ddc1b4c75b396537a181d35527bf1092f46d7e75fc14c93c8058e368aecb5" Mar 17 03:41:43 crc kubenswrapper[4735]: I0317 03:41:43.949819 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"} Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.157309 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561982-nrdtt"] Mar 17 03:42:00 crc kubenswrapper[4735]: E0317 03:42:00.159044 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0a40f8-f770-46e2-8c7a-13a0aa615eeb" containerName="oc" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.159118 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0a40f8-f770-46e2-8c7a-13a0aa615eeb" containerName="oc" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.159360 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0a40f8-f770-46e2-8c7a-13a0aa615eeb" containerName="oc" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.161196 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.167146 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.167182 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.168380 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.183028 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561982-nrdtt"] Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.235048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkhq\" (UniqueName: \"kubernetes.io/projected/75fba438-839c-408f-904e-5d7fc8c24a36-kube-api-access-slkhq\") pod \"auto-csr-approver-29561982-nrdtt\" (UID: \"75fba438-839c-408f-904e-5d7fc8c24a36\") " pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.338078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkhq\" (UniqueName: \"kubernetes.io/projected/75fba438-839c-408f-904e-5d7fc8c24a36-kube-api-access-slkhq\") pod \"auto-csr-approver-29561982-nrdtt\" (UID: \"75fba438-839c-408f-904e-5d7fc8c24a36\") " pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.360992 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkhq\" (UniqueName: \"kubernetes.io/projected/75fba438-839c-408f-904e-5d7fc8c24a36-kube-api-access-slkhq\") pod \"auto-csr-approver-29561982-nrdtt\" (UID: \"75fba438-839c-408f-904e-5d7fc8c24a36\") " 
pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.482942 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:00 crc kubenswrapper[4735]: I0317 03:42:00.996226 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561982-nrdtt"] Mar 17 03:42:01 crc kubenswrapper[4735]: I0317 03:42:01.147139 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" event={"ID":"75fba438-839c-408f-904e-5d7fc8c24a36","Type":"ContainerStarted","Data":"29a65031d4546b40b893a2a5541fac23d44b8d6a27fb95b38900f5c18836ca29"} Mar 17 03:42:03 crc kubenswrapper[4735]: I0317 03:42:03.172088 4735 generic.go:334] "Generic (PLEG): container finished" podID="75fba438-839c-408f-904e-5d7fc8c24a36" containerID="62fe57647da1635ee9153850c5a40c50f18b7d9d480965b0c222281fecd9b38a" exitCode=0 Mar 17 03:42:03 crc kubenswrapper[4735]: I0317 03:42:03.172168 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" event={"ID":"75fba438-839c-408f-904e-5d7fc8c24a36","Type":"ContainerDied","Data":"62fe57647da1635ee9153850c5a40c50f18b7d9d480965b0c222281fecd9b38a"} Mar 17 03:42:04 crc kubenswrapper[4735]: I0317 03:42:04.594812 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:04 crc kubenswrapper[4735]: I0317 03:42:04.734583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkhq\" (UniqueName: \"kubernetes.io/projected/75fba438-839c-408f-904e-5d7fc8c24a36-kube-api-access-slkhq\") pod \"75fba438-839c-408f-904e-5d7fc8c24a36\" (UID: \"75fba438-839c-408f-904e-5d7fc8c24a36\") " Mar 17 03:42:04 crc kubenswrapper[4735]: I0317 03:42:04.744699 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75fba438-839c-408f-904e-5d7fc8c24a36-kube-api-access-slkhq" (OuterVolumeSpecName: "kube-api-access-slkhq") pod "75fba438-839c-408f-904e-5d7fc8c24a36" (UID: "75fba438-839c-408f-904e-5d7fc8c24a36"). InnerVolumeSpecName "kube-api-access-slkhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:42:04 crc kubenswrapper[4735]: I0317 03:42:04.837874 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slkhq\" (UniqueName: \"kubernetes.io/projected/75fba438-839c-408f-904e-5d7fc8c24a36-kube-api-access-slkhq\") on node \"crc\" DevicePath \"\"" Mar 17 03:42:05 crc kubenswrapper[4735]: I0317 03:42:05.194104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" event={"ID":"75fba438-839c-408f-904e-5d7fc8c24a36","Type":"ContainerDied","Data":"29a65031d4546b40b893a2a5541fac23d44b8d6a27fb95b38900f5c18836ca29"} Mar 17 03:42:05 crc kubenswrapper[4735]: I0317 03:42:05.194159 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a65031d4546b40b893a2a5541fac23d44b8d6a27fb95b38900f5c18836ca29" Mar 17 03:42:05 crc kubenswrapper[4735]: I0317 03:42:05.194239 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561982-nrdtt" Mar 17 03:42:05 crc kubenswrapper[4735]: I0317 03:42:05.675783 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561976-f6s7n"] Mar 17 03:42:05 crc kubenswrapper[4735]: I0317 03:42:05.682132 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561976-f6s7n"] Mar 17 03:42:07 crc kubenswrapper[4735]: I0317 03:42:07.083737 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41b877d-4fd6-46e3-bd90-801cbd9f5660" path="/var/lib/kubelet/pods/b41b877d-4fd6-46e3-bd90-801cbd9f5660/volumes" Mar 17 03:42:08 crc kubenswrapper[4735]: I0317 03:42:08.193399 4735 scope.go:117] "RemoveContainer" containerID="545c20ca69af9b5255e18d61d942a61b8d9cbc96cc34e9975824abeec06f4856" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.859148 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2hmjr"] Mar 17 03:43:12 crc kubenswrapper[4735]: E0317 03:43:12.860624 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75fba438-839c-408f-904e-5d7fc8c24a36" containerName="oc" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.860652 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="75fba438-839c-408f-904e-5d7fc8c24a36" containerName="oc" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.861142 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="75fba438-839c-408f-904e-5d7fc8c24a36" containerName="oc" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.867860 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.875598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hmjr"] Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.997807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-catalog-content\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.998007 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-utilities\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:12 crc kubenswrapper[4735]: I0317 03:43:12.998053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ncf\" (UniqueName: \"kubernetes.io/projected/a09cf394-6290-433f-9ee4-404aa2723ced-kube-api-access-r8ncf\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.100546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-utilities\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.100636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r8ncf\" (UniqueName: \"kubernetes.io/projected/a09cf394-6290-433f-9ee4-404aa2723ced-kube-api-access-r8ncf\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.100923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-catalog-content\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.101459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-utilities\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.101974 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-catalog-content\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.149564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8ncf\" (UniqueName: \"kubernetes.io/projected/a09cf394-6290-433f-9ee4-404aa2723ced-kube-api-access-r8ncf\") pod \"redhat-operators-2hmjr\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.205954 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.736313 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hmjr"] Mar 17 03:43:13 crc kubenswrapper[4735]: I0317 03:43:13.908010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerStarted","Data":"5abc22f8e4667c9280884280c9eee5e527b5443f954a05db9b79c8509208e022"} Mar 17 03:43:14 crc kubenswrapper[4735]: E0317 03:43:14.179276 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09cf394_6290_433f_9ee4_404aa2723ced.slice/crio-conmon-5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09cf394_6290_433f_9ee4_404aa2723ced.slice/crio-5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f.scope\": RecentStats: unable to find data in memory cache]" Mar 17 03:43:14 crc kubenswrapper[4735]: I0317 03:43:14.917013 4735 generic.go:334] "Generic (PLEG): container finished" podID="a09cf394-6290-433f-9ee4-404aa2723ced" containerID="5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f" exitCode=0 Mar 17 03:43:14 crc kubenswrapper[4735]: I0317 03:43:14.917104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerDied","Data":"5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f"} Mar 17 03:43:14 crc kubenswrapper[4735]: I0317 03:43:14.919026 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:43:15 crc 
kubenswrapper[4735]: I0317 03:43:15.928104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerStarted","Data":"b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99"} Mar 17 03:43:21 crc kubenswrapper[4735]: I0317 03:43:21.987983 4735 generic.go:334] "Generic (PLEG): container finished" podID="a09cf394-6290-433f-9ee4-404aa2723ced" containerID="b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99" exitCode=0 Mar 17 03:43:21 crc kubenswrapper[4735]: I0317 03:43:21.988047 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerDied","Data":"b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99"} Mar 17 03:43:23 crc kubenswrapper[4735]: I0317 03:43:23.000533 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerStarted","Data":"0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c"} Mar 17 03:43:23 crc kubenswrapper[4735]: I0317 03:43:23.030891 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2hmjr" podStartSLOduration=3.356534033 podStartE2EDuration="11.030872353s" podCreationTimestamp="2026-03-17 03:43:12 +0000 UTC" firstStartedPulling="2026-03-17 03:43:14.918819331 +0000 UTC m=+9220.551052309" lastFinishedPulling="2026-03-17 03:43:22.593157651 +0000 UTC m=+9228.225390629" observedRunningTime="2026-03-17 03:43:23.026982468 +0000 UTC m=+9228.659215436" watchObservedRunningTime="2026-03-17 03:43:23.030872353 +0000 UTC m=+9228.663105331" Mar 17 03:43:23 crc kubenswrapper[4735]: I0317 03:43:23.206851 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:23 crc kubenswrapper[4735]: I0317 03:43:23.207138 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:43:24 crc kubenswrapper[4735]: I0317 03:43:24.258299 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hmjr" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" probeResult="failure" output=< Mar 17 03:43:24 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:43:24 crc kubenswrapper[4735]: > Mar 17 03:43:34 crc kubenswrapper[4735]: I0317 03:43:34.261883 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hmjr" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" probeResult="failure" output=< Mar 17 03:43:34 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:43:34 crc kubenswrapper[4735]: > Mar 17 03:43:44 crc kubenswrapper[4735]: I0317 03:43:44.251950 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hmjr" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" probeResult="failure" output=< Mar 17 03:43:44 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:43:44 crc kubenswrapper[4735]: > Mar 17 03:43:54 crc kubenswrapper[4735]: I0317 03:43:54.281187 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hmjr" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" probeResult="failure" output=< Mar 17 03:43:54 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:43:54 crc kubenswrapper[4735]: > Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.153798 4735 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29561984-8rqhm"] Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.156770 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.159210 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.159235 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.161209 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.164669 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561984-8rqhm"] Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.281799 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2d4\" (UniqueName: \"kubernetes.io/projected/713eefe9-a95c-4145-8b76-a6f05c7ff4f0-kube-api-access-7l2d4\") pod \"auto-csr-approver-29561984-8rqhm\" (UID: \"713eefe9-a95c-4145-8b76-a6f05c7ff4f0\") " pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.384010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2d4\" (UniqueName: \"kubernetes.io/projected/713eefe9-a95c-4145-8b76-a6f05c7ff4f0-kube-api-access-7l2d4\") pod \"auto-csr-approver-29561984-8rqhm\" (UID: \"713eefe9-a95c-4145-8b76-a6f05c7ff4f0\") " pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.409011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2d4\" (UniqueName: 
\"kubernetes.io/projected/713eefe9-a95c-4145-8b76-a6f05c7ff4f0-kube-api-access-7l2d4\") pod \"auto-csr-approver-29561984-8rqhm\" (UID: \"713eefe9-a95c-4145-8b76-a6f05c7ff4f0\") " pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:00 crc kubenswrapper[4735]: I0317 03:44:00.492451 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:01 crc kubenswrapper[4735]: I0317 03:44:01.317363 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561984-8rqhm"] Mar 17 03:44:01 crc kubenswrapper[4735]: W0317 03:44:01.325289 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod713eefe9_a95c_4145_8b76_a6f05c7ff4f0.slice/crio-b22c630b56565ff5f39926e1b11e0f18def2aeae3dad97d958babef36611371b WatchSource:0}: Error finding container b22c630b56565ff5f39926e1b11e0f18def2aeae3dad97d958babef36611371b: Status 404 returned error can't find the container with id b22c630b56565ff5f39926e1b11e0f18def2aeae3dad97d958babef36611371b Mar 17 03:44:01 crc kubenswrapper[4735]: I0317 03:44:01.337405 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" event={"ID":"713eefe9-a95c-4145-8b76-a6f05c7ff4f0","Type":"ContainerStarted","Data":"b22c630b56565ff5f39926e1b11e0f18def2aeae3dad97d958babef36611371b"} Mar 17 03:44:03 crc kubenswrapper[4735]: I0317 03:44:03.266358 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:44:03 crc kubenswrapper[4735]: I0317 03:44:03.327048 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:44:03 crc kubenswrapper[4735]: I0317 03:44:03.371583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561984-8rqhm" event={"ID":"713eefe9-a95c-4145-8b76-a6f05c7ff4f0","Type":"ContainerStarted","Data":"b7771e7521699c4f1384364a227b1605bd78b86ef4280a8d139b67395fb2154d"} Mar 17 03:44:03 crc kubenswrapper[4735]: I0317 03:44:03.391400 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" podStartSLOduration=2.430290045 podStartE2EDuration="3.39137613s" podCreationTimestamp="2026-03-17 03:44:00 +0000 UTC" firstStartedPulling="2026-03-17 03:44:01.326371398 +0000 UTC m=+9266.958604376" lastFinishedPulling="2026-03-17 03:44:02.287457443 +0000 UTC m=+9267.919690461" observedRunningTime="2026-03-17 03:44:03.385263702 +0000 UTC m=+9269.017496680" watchObservedRunningTime="2026-03-17 03:44:03.39137613 +0000 UTC m=+9269.023609098" Mar 17 03:44:03 crc kubenswrapper[4735]: I0317 03:44:03.505770 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hmjr"] Mar 17 03:44:04 crc kubenswrapper[4735]: I0317 03:44:04.382808 4735 generic.go:334] "Generic (PLEG): container finished" podID="713eefe9-a95c-4145-8b76-a6f05c7ff4f0" containerID="b7771e7521699c4f1384364a227b1605bd78b86ef4280a8d139b67395fb2154d" exitCode=0 Mar 17 03:44:04 crc kubenswrapper[4735]: I0317 03:44:04.382889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" event={"ID":"713eefe9-a95c-4145-8b76-a6f05c7ff4f0","Type":"ContainerDied","Data":"b7771e7521699c4f1384364a227b1605bd78b86ef4280a8d139b67395fb2154d"} Mar 17 03:44:04 crc kubenswrapper[4735]: I0317 03:44:04.383054 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2hmjr" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" containerID="cri-o://0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c" gracePeriod=2 Mar 17 03:44:05 crc kubenswrapper[4735]: 
I0317 03:44:05.166367 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.338266 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-catalog-content\") pod \"a09cf394-6290-433f-9ee4-404aa2723ced\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.338393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ncf\" (UniqueName: \"kubernetes.io/projected/a09cf394-6290-433f-9ee4-404aa2723ced-kube-api-access-r8ncf\") pod \"a09cf394-6290-433f-9ee4-404aa2723ced\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.338434 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-utilities\") pod \"a09cf394-6290-433f-9ee4-404aa2723ced\" (UID: \"a09cf394-6290-433f-9ee4-404aa2723ced\") " Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.340191 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-utilities" (OuterVolumeSpecName: "utilities") pod "a09cf394-6290-433f-9ee4-404aa2723ced" (UID: "a09cf394-6290-433f-9ee4-404aa2723ced"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.356690 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09cf394-6290-433f-9ee4-404aa2723ced-kube-api-access-r8ncf" (OuterVolumeSpecName: "kube-api-access-r8ncf") pod "a09cf394-6290-433f-9ee4-404aa2723ced" (UID: "a09cf394-6290-433f-9ee4-404aa2723ced"). InnerVolumeSpecName "kube-api-access-r8ncf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.396369 4735 generic.go:334] "Generic (PLEG): container finished" podID="a09cf394-6290-433f-9ee4-404aa2723ced" containerID="0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c" exitCode=0 Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.396715 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hmjr" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.396714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerDied","Data":"0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c"} Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.398080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hmjr" event={"ID":"a09cf394-6290-433f-9ee4-404aa2723ced","Type":"ContainerDied","Data":"5abc22f8e4667c9280884280c9eee5e527b5443f954a05db9b79c8509208e022"} Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.398152 4735 scope.go:117] "RemoveContainer" containerID="0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.433020 4735 scope.go:117] "RemoveContainer" containerID="b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99" Mar 17 03:44:05 crc 
kubenswrapper[4735]: I0317 03:44:05.442410 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ncf\" (UniqueName: \"kubernetes.io/projected/a09cf394-6290-433f-9ee4-404aa2723ced-kube-api-access-r8ncf\") on node \"crc\" DevicePath \"\"" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.442619 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.461390 4735 scope.go:117] "RemoveContainer" containerID="5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.490533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a09cf394-6290-433f-9ee4-404aa2723ced" (UID: "a09cf394-6290-433f-9ee4-404aa2723ced"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.518048 4735 scope.go:117] "RemoveContainer" containerID="0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c" Mar 17 03:44:05 crc kubenswrapper[4735]: E0317 03:44:05.542463 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c\": container with ID starting with 0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c not found: ID does not exist" containerID="0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.542533 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c"} err="failed to get container status \"0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c\": rpc error: code = NotFound desc = could not find container \"0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c\": container with ID starting with 0ac2cc15f46cf7ffbc3978c4b7c2a24dd5b3d3d666469fc88233d49eab51662c not found: ID does not exist" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.542565 4735 scope.go:117] "RemoveContainer" containerID="b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99" Mar 17 03:44:05 crc kubenswrapper[4735]: E0317 03:44:05.542964 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99\": container with ID starting with b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99 not found: ID does not exist" containerID="b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.543028 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99"} err="failed to get container status \"b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99\": rpc error: code = NotFound desc = could not find container \"b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99\": container with ID starting with b5332db2669a094cc059b6dd5c3866cf94add97686129f66d3df3c2ae6870a99 not found: ID does not exist" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.543048 4735 scope.go:117] "RemoveContainer" containerID="5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f" Mar 17 03:44:05 crc kubenswrapper[4735]: E0317 03:44:05.543367 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f\": container with ID starting with 5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f not found: ID does not exist" containerID="5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.543412 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f"} err="failed to get container status \"5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f\": rpc error: code = NotFound desc = could not find container \"5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f\": container with ID starting with 5b25f481236a8dc4830dd92197f58ef5cedf06b1e00d14433681c1db9fc1c54f not found: ID does not exist" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.543469 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a09cf394-6290-433f-9ee4-404aa2723ced-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.775119 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.796411 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hmjr"] Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.811092 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2hmjr"] Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.847100 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l2d4\" (UniqueName: \"kubernetes.io/projected/713eefe9-a95c-4145-8b76-a6f05c7ff4f0-kube-api-access-7l2d4\") pod \"713eefe9-a95c-4145-8b76-a6f05c7ff4f0\" (UID: \"713eefe9-a95c-4145-8b76-a6f05c7ff4f0\") " Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.853545 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713eefe9-a95c-4145-8b76-a6f05c7ff4f0-kube-api-access-7l2d4" (OuterVolumeSpecName: "kube-api-access-7l2d4") pod "713eefe9-a95c-4145-8b76-a6f05c7ff4f0" (UID: "713eefe9-a95c-4145-8b76-a6f05c7ff4f0"). InnerVolumeSpecName "kube-api-access-7l2d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:44:05 crc kubenswrapper[4735]: I0317 03:44:05.948361 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l2d4\" (UniqueName: \"kubernetes.io/projected/713eefe9-a95c-4145-8b76-a6f05c7ff4f0-kube-api-access-7l2d4\") on node \"crc\" DevicePath \"\"" Mar 17 03:44:06 crc kubenswrapper[4735]: I0317 03:44:06.411270 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" Mar 17 03:44:06 crc kubenswrapper[4735]: I0317 03:44:06.411266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561984-8rqhm" event={"ID":"713eefe9-a95c-4145-8b76-a6f05c7ff4f0","Type":"ContainerDied","Data":"b22c630b56565ff5f39926e1b11e0f18def2aeae3dad97d958babef36611371b"} Mar 17 03:44:06 crc kubenswrapper[4735]: I0317 03:44:06.411439 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22c630b56565ff5f39926e1b11e0f18def2aeae3dad97d958babef36611371b" Mar 17 03:44:06 crc kubenswrapper[4735]: I0317 03:44:06.498488 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561978-gr75q"] Mar 17 03:44:06 crc kubenswrapper[4735]: I0317 03:44:06.508037 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561978-gr75q"] Mar 17 03:44:07 crc kubenswrapper[4735]: I0317 03:44:07.087386 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b80fdd5-0ad0-4d77-9eb0-1196019ca352" path="/var/lib/kubelet/pods/9b80fdd5-0ad0-4d77-9eb0-1196019ca352/volumes" Mar 17 03:44:07 crc kubenswrapper[4735]: I0317 03:44:07.088972 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" path="/var/lib/kubelet/pods/a09cf394-6290-433f-9ee4-404aa2723ced/volumes" Mar 17 03:44:08 crc kubenswrapper[4735]: I0317 03:44:08.310365 4735 scope.go:117] "RemoveContainer" containerID="b92d3082b49e865447277f99bdf275ebf1db17038dd2007e0f51dfd1408108b5" Mar 17 03:44:12 crc kubenswrapper[4735]: I0317 03:44:12.606247 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 17 03:44:12 crc kubenswrapper[4735]: I0317 03:44:12.607511 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:44:42 crc kubenswrapper[4735]: I0317 03:44:42.607252 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:44:42 crc kubenswrapper[4735]: I0317 03:44:42.607930 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.159967 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq"] Mar 17 03:45:00 crc kubenswrapper[4735]: E0317 03:45:00.160906 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.160922 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" Mar 17 03:45:00 crc kubenswrapper[4735]: E0317 03:45:00.160968 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="extract-utilities" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 
03:45:00.160977 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="extract-utilities" Mar 17 03:45:00 crc kubenswrapper[4735]: E0317 03:45:00.160990 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713eefe9-a95c-4145-8b76-a6f05c7ff4f0" containerName="oc" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.160998 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="713eefe9-a95c-4145-8b76-a6f05c7ff4f0" containerName="oc" Mar 17 03:45:00 crc kubenswrapper[4735]: E0317 03:45:00.161015 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="extract-content" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.161023 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="extract-content" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.161231 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09cf394-6290-433f-9ee4-404aa2723ced" containerName="registry-server" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.161263 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="713eefe9-a95c-4145-8b76-a6f05c7ff4f0" containerName="oc" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.161974 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.164734 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.168228 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq"] Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.173168 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.269400 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhw4\" (UniqueName: \"kubernetes.io/projected/e632db0e-807f-428b-ad2d-e842c3ef7d15-kube-api-access-dqhw4\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.269819 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e632db0e-807f-428b-ad2d-e842c3ef7d15-secret-volume\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.269929 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e632db0e-807f-428b-ad2d-e842c3ef7d15-config-volume\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.371405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e632db0e-807f-428b-ad2d-e842c3ef7d15-config-volume\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.371511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhw4\" (UniqueName: \"kubernetes.io/projected/e632db0e-807f-428b-ad2d-e842c3ef7d15-kube-api-access-dqhw4\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.371577 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e632db0e-807f-428b-ad2d-e842c3ef7d15-secret-volume\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.372267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e632db0e-807f-428b-ad2d-e842c3ef7d15-config-volume\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.379487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e632db0e-807f-428b-ad2d-e842c3ef7d15-secret-volume\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.394549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhw4\" (UniqueName: \"kubernetes.io/projected/e632db0e-807f-428b-ad2d-e842c3ef7d15-kube-api-access-dqhw4\") pod \"collect-profiles-29561985-5x5nq\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.485433 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:00 crc kubenswrapper[4735]: I0317 03:45:00.961396 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq"] Mar 17 03:45:01 crc kubenswrapper[4735]: I0317 03:45:01.980346 4735 generic.go:334] "Generic (PLEG): container finished" podID="e632db0e-807f-428b-ad2d-e842c3ef7d15" containerID="28b14f41c59197f152fce67a9937502a882cea71b935f9cf4c5afb1cc2365cc4" exitCode=0 Mar 17 03:45:01 crc kubenswrapper[4735]: I0317 03:45:01.980415 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" event={"ID":"e632db0e-807f-428b-ad2d-e842c3ef7d15","Type":"ContainerDied","Data":"28b14f41c59197f152fce67a9937502a882cea71b935f9cf4c5afb1cc2365cc4"} Mar 17 03:45:01 crc kubenswrapper[4735]: I0317 03:45:01.980598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" 
event={"ID":"e632db0e-807f-428b-ad2d-e842c3ef7d15","Type":"ContainerStarted","Data":"2fd967a829744ec515bda96dc146837fcfc1a52802dc3e5948015e94e4b56564"} Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.350968 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.433531 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhw4\" (UniqueName: \"kubernetes.io/projected/e632db0e-807f-428b-ad2d-e842c3ef7d15-kube-api-access-dqhw4\") pod \"e632db0e-807f-428b-ad2d-e842c3ef7d15\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.433786 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e632db0e-807f-428b-ad2d-e842c3ef7d15-config-volume\") pod \"e632db0e-807f-428b-ad2d-e842c3ef7d15\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.433917 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e632db0e-807f-428b-ad2d-e842c3ef7d15-secret-volume\") pod \"e632db0e-807f-428b-ad2d-e842c3ef7d15\" (UID: \"e632db0e-807f-428b-ad2d-e842c3ef7d15\") " Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.434839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e632db0e-807f-428b-ad2d-e842c3ef7d15-config-volume" (OuterVolumeSpecName: "config-volume") pod "e632db0e-807f-428b-ad2d-e842c3ef7d15" (UID: "e632db0e-807f-428b-ad2d-e842c3ef7d15"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.439829 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e632db0e-807f-428b-ad2d-e842c3ef7d15-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e632db0e-807f-428b-ad2d-e842c3ef7d15" (UID: "e632db0e-807f-428b-ad2d-e842c3ef7d15"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.443419 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e632db0e-807f-428b-ad2d-e842c3ef7d15-kube-api-access-dqhw4" (OuterVolumeSpecName: "kube-api-access-dqhw4") pod "e632db0e-807f-428b-ad2d-e842c3ef7d15" (UID: "e632db0e-807f-428b-ad2d-e842c3ef7d15"). InnerVolumeSpecName "kube-api-access-dqhw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.536471 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e632db0e-807f-428b-ad2d-e842c3ef7d15-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.536704 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhw4\" (UniqueName: \"kubernetes.io/projected/e632db0e-807f-428b-ad2d-e842c3ef7d15-kube-api-access-dqhw4\") on node \"crc\" DevicePath \"\"" Mar 17 03:45:03 crc kubenswrapper[4735]: I0317 03:45:03.536781 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e632db0e-807f-428b-ad2d-e842c3ef7d15-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 03:45:04 crc kubenswrapper[4735]: I0317 03:45:04.000928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" 
event={"ID":"e632db0e-807f-428b-ad2d-e842c3ef7d15","Type":"ContainerDied","Data":"2fd967a829744ec515bda96dc146837fcfc1a52802dc3e5948015e94e4b56564"} Mar 17 03:45:04 crc kubenswrapper[4735]: I0317 03:45:04.000985 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fd967a829744ec515bda96dc146837fcfc1a52802dc3e5948015e94e4b56564" Mar 17 03:45:04 crc kubenswrapper[4735]: I0317 03:45:04.000955 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq" Mar 17 03:45:04 crc kubenswrapper[4735]: I0317 03:45:04.432776 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499"] Mar 17 03:45:04 crc kubenswrapper[4735]: I0317 03:45:04.451176 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561940-fh499"] Mar 17 03:45:05 crc kubenswrapper[4735]: I0317 03:45:05.084739 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d58c75f-83cc-4cc0-8a39-8f58353bbd0e" path="/var/lib/kubelet/pods/9d58c75f-83cc-4cc0-8a39-8f58353bbd0e/volumes" Mar 17 03:45:08 crc kubenswrapper[4735]: I0317 03:45:08.457365 4735 scope.go:117] "RemoveContainer" containerID="3dc99e2b4d331318bddc8f10fcf4261b7e8faed32ca6f024442a00de13af397c" Mar 17 03:45:11 crc kubenswrapper[4735]: I0317 03:45:11.981105 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpdb8"] Mar 17 03:45:11 crc kubenswrapper[4735]: E0317 03:45:11.982125 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e632db0e-807f-428b-ad2d-e842c3ef7d15" containerName="collect-profiles" Mar 17 03:45:11 crc kubenswrapper[4735]: I0317 03:45:11.982150 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e632db0e-807f-428b-ad2d-e842c3ef7d15" containerName="collect-profiles" Mar 17 03:45:11 crc 
kubenswrapper[4735]: I0317 03:45:11.982397 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e632db0e-807f-428b-ad2d-e842c3ef7d15" containerName="collect-profiles" Mar 17 03:45:11 crc kubenswrapper[4735]: I0317 03:45:11.984007 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.004804 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpdb8"] Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.115466 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf62916f-f358-4caa-9c7e-37527c4610d3-catalog-content\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.116346 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz76f\" (UniqueName: \"kubernetes.io/projected/bf62916f-f358-4caa-9c7e-37527c4610d3-kube-api-access-qz76f\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.116660 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf62916f-f358-4caa-9c7e-37527c4610d3-utilities\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.218696 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bf62916f-f358-4caa-9c7e-37527c4610d3-utilities\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.219130 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf62916f-f358-4caa-9c7e-37527c4610d3-catalog-content\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.219240 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz76f\" (UniqueName: \"kubernetes.io/projected/bf62916f-f358-4caa-9c7e-37527c4610d3-kube-api-access-qz76f\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.219468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf62916f-f358-4caa-9c7e-37527c4610d3-utilities\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.219679 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf62916f-f358-4caa-9c7e-37527c4610d3-catalog-content\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.245079 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz76f\" (UniqueName: 
\"kubernetes.io/projected/bf62916f-f358-4caa-9c7e-37527c4610d3-kube-api-access-qz76f\") pod \"community-operators-qpdb8\" (UID: \"bf62916f-f358-4caa-9c7e-37527c4610d3\") " pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.320936 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.606267 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.606333 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.606380 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.607194 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.607593 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" gracePeriod=600 Mar 17 03:45:12 crc kubenswrapper[4735]: E0317 03:45:12.738641 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:45:12 crc kubenswrapper[4735]: I0317 03:45:12.857191 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpdb8"] Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.107094 4735 generic.go:334] "Generic (PLEG): container finished" podID="bf62916f-f358-4caa-9c7e-37527c4610d3" containerID="6bc696e849e53b1d1e1dbd4172ece1c90d2d9c037d4da32cea32e93c059c3362" exitCode=0 Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.107201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpdb8" event={"ID":"bf62916f-f358-4caa-9c7e-37527c4610d3","Type":"ContainerDied","Data":"6bc696e849e53b1d1e1dbd4172ece1c90d2d9c037d4da32cea32e93c059c3362"} Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.107226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpdb8" event={"ID":"bf62916f-f358-4caa-9c7e-37527c4610d3","Type":"ContainerStarted","Data":"9c5a2608acb964c8685645f4fc707b761509655f3596e73383bf4526a999ace7"} Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.110585 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" 
containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" exitCode=0 Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.110649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"} Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.111553 4735 scope.go:117] "RemoveContainer" containerID="2d744d5f50a3568187c8579fc0bcf095addb71076d6921ec818ecc64259e4172" Mar 17 03:45:13 crc kubenswrapper[4735]: I0317 03:45:13.112263 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:45:13 crc kubenswrapper[4735]: E0317 03:45:13.112513 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:45:19 crc kubenswrapper[4735]: I0317 03:45:19.283753 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpdb8" event={"ID":"bf62916f-f358-4caa-9c7e-37527c4610d3","Type":"ContainerStarted","Data":"cfefa08a7ef2b7741bb5e5eac0da4f5037830d5ca6ca162436b68310a22f39df"} Mar 17 03:45:20 crc kubenswrapper[4735]: I0317 03:45:20.296444 4735 generic.go:334] "Generic (PLEG): container finished" podID="bf62916f-f358-4caa-9c7e-37527c4610d3" containerID="cfefa08a7ef2b7741bb5e5eac0da4f5037830d5ca6ca162436b68310a22f39df" exitCode=0 Mar 17 03:45:20 crc kubenswrapper[4735]: I0317 03:45:20.296493 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-qpdb8" event={"ID":"bf62916f-f358-4caa-9c7e-37527c4610d3","Type":"ContainerDied","Data":"cfefa08a7ef2b7741bb5e5eac0da4f5037830d5ca6ca162436b68310a22f39df"} Mar 17 03:45:21 crc kubenswrapper[4735]: I0317 03:45:21.310685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpdb8" event={"ID":"bf62916f-f358-4caa-9c7e-37527c4610d3","Type":"ContainerStarted","Data":"cde8ccb8efdf3b2880ba9adf20cd14efa7645cd125609e732242e0ab261a88c6"} Mar 17 03:45:21 crc kubenswrapper[4735]: I0317 03:45:21.349789 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qpdb8" podStartSLOduration=2.716905789 podStartE2EDuration="10.349761656s" podCreationTimestamp="2026-03-17 03:45:11 +0000 UTC" firstStartedPulling="2026-03-17 03:45:13.108706006 +0000 UTC m=+9338.740938984" lastFinishedPulling="2026-03-17 03:45:20.741561873 +0000 UTC m=+9346.373794851" observedRunningTime="2026-03-17 03:45:21.332424427 +0000 UTC m=+9346.964657425" watchObservedRunningTime="2026-03-17 03:45:21.349761656 +0000 UTC m=+9346.981994664" Mar 17 03:45:22 crc kubenswrapper[4735]: I0317 03:45:22.320071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:22 crc kubenswrapper[4735]: I0317 03:45:22.321208 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:23 crc kubenswrapper[4735]: I0317 03:45:23.405087 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qpdb8" podUID="bf62916f-f358-4caa-9c7e-37527c4610d3" containerName="registry-server" probeResult="failure" output=< Mar 17 03:45:23 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:45:23 crc kubenswrapper[4735]: > Mar 17 03:45:24 crc 
kubenswrapper[4735]: I0317 03:45:24.073148 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:45:24 crc kubenswrapper[4735]: E0317 03:45:24.073672 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:45:32 crc kubenswrapper[4735]: I0317 03:45:32.378684 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:32 crc kubenswrapper[4735]: I0317 03:45:32.440614 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpdb8" Mar 17 03:45:32 crc kubenswrapper[4735]: I0317 03:45:32.528736 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpdb8"] Mar 17 03:45:32 crc kubenswrapper[4735]: I0317 03:45:32.621075 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 03:45:32 crc kubenswrapper[4735]: I0317 03:45:32.621542 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jxvlc" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="registry-server" containerID="cri-o://af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892" gracePeriod=2 Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.289695 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.435125 4735 generic.go:334] "Generic (PLEG): container finished" podID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerID="af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892" exitCode=0 Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.435173 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxvlc" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.435225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerDied","Data":"af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892"} Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.435248 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxvlc" event={"ID":"ffd75092-d1c3-4e3b-b19b-26a6c94fab40","Type":"ContainerDied","Data":"8b580c67c2de3bddfcc304dd4792aa80aba4dcb300bcf59439fb7c840e33a5a9"} Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.435283 4735 scope.go:117] "RemoveContainer" containerID="af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.465065 4735 scope.go:117] "RemoveContainer" containerID="b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.475020 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-utilities\") pod \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.475102 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-catalog-content\") pod \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.475169 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlm29\" (UniqueName: \"kubernetes.io/projected/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-kube-api-access-xlm29\") pod \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\" (UID: \"ffd75092-d1c3-4e3b-b19b-26a6c94fab40\") " Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.475586 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-utilities" (OuterVolumeSpecName: "utilities") pod "ffd75092-d1c3-4e3b-b19b-26a6c94fab40" (UID: "ffd75092-d1c3-4e3b-b19b-26a6c94fab40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.517160 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-kube-api-access-xlm29" (OuterVolumeSpecName: "kube-api-access-xlm29") pod "ffd75092-d1c3-4e3b-b19b-26a6c94fab40" (UID: "ffd75092-d1c3-4e3b-b19b-26a6c94fab40"). InnerVolumeSpecName "kube-api-access-xlm29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.522107 4735 scope.go:117] "RemoveContainer" containerID="3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.552963 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffd75092-d1c3-4e3b-b19b-26a6c94fab40" (UID: "ffd75092-d1c3-4e3b-b19b-26a6c94fab40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.576795 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.576823 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlm29\" (UniqueName: \"kubernetes.io/projected/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-kube-api-access-xlm29\") on node \"crc\" DevicePath \"\"" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.576832 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd75092-d1c3-4e3b-b19b-26a6c94fab40-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.581215 4735 scope.go:117] "RemoveContainer" containerID="af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892" Mar 17 03:45:33 crc kubenswrapper[4735]: E0317 03:45:33.581701 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892\": container with ID starting with 
af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892 not found: ID does not exist" containerID="af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.581932 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892"} err="failed to get container status \"af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892\": rpc error: code = NotFound desc = could not find container \"af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892\": container with ID starting with af13f122c40191809c6231177e8b70ef1d339ceb83432657794c6f411e939892 not found: ID does not exist" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.581955 4735 scope.go:117] "RemoveContainer" containerID="b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7" Mar 17 03:45:33 crc kubenswrapper[4735]: E0317 03:45:33.582288 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7\": container with ID starting with b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7 not found: ID does not exist" containerID="b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.582313 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7"} err="failed to get container status \"b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7\": rpc error: code = NotFound desc = could not find container \"b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7\": container with ID starting with b3c52b6d29f48cc354b0ab0faac892bfa868ef5580cdcb24f3f51b8c5bc7dcf7 not found: ID does not 
exist" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.582329 4735 scope.go:117] "RemoveContainer" containerID="3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830" Mar 17 03:45:33 crc kubenswrapper[4735]: E0317 03:45:33.582816 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830\": container with ID starting with 3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830 not found: ID does not exist" containerID="3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.582842 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830"} err="failed to get container status \"3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830\": rpc error: code = NotFound desc = could not find container \"3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830\": container with ID starting with 3c37fbc58fea0f1a5f53a77d8afcca4ca3574647f88af640eb106b5b28f4d830 not found: ID does not exist" Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.796442 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 03:45:33 crc kubenswrapper[4735]: I0317 03:45:33.804785 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jxvlc"] Mar 17 03:45:35 crc kubenswrapper[4735]: I0317 03:45:35.085332 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" path="/var/lib/kubelet/pods/ffd75092-d1c3-4e3b-b19b-26a6c94fab40/volumes" Mar 17 03:45:37 crc kubenswrapper[4735]: I0317 03:45:37.073571 4735 scope.go:117] "RemoveContainer" 
containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:45:37 crc kubenswrapper[4735]: E0317 03:45:37.074246 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:45:50 crc kubenswrapper[4735]: I0317 03:45:50.073962 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:45:50 crc kubenswrapper[4735]: E0317 03:45:50.075071 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.177174 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561986-jsp2s"] Mar 17 03:46:00 crc kubenswrapper[4735]: E0317 03:46:00.177963 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="registry-server" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.177975 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="registry-server" Mar 17 03:46:00 crc kubenswrapper[4735]: E0317 03:46:00.177990 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" 
containerName="extract-content" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.177995 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="extract-content" Mar 17 03:46:00 crc kubenswrapper[4735]: E0317 03:46:00.178025 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="extract-utilities" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.178033 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="extract-utilities" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.178196 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd75092-d1c3-4e3b-b19b-26a6c94fab40" containerName="registry-server" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.178785 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.181907 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.185523 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.185703 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.194394 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561986-jsp2s"] Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.254212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzr2b\" (UniqueName: 
\"kubernetes.io/projected/26493eba-11d9-47d0-9eaa-079f1d3451c1-kube-api-access-kzr2b\") pod \"auto-csr-approver-29561986-jsp2s\" (UID: \"26493eba-11d9-47d0-9eaa-079f1d3451c1\") " pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.356301 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzr2b\" (UniqueName: \"kubernetes.io/projected/26493eba-11d9-47d0-9eaa-079f1d3451c1-kube-api-access-kzr2b\") pod \"auto-csr-approver-29561986-jsp2s\" (UID: \"26493eba-11d9-47d0-9eaa-079f1d3451c1\") " pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.392617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzr2b\" (UniqueName: \"kubernetes.io/projected/26493eba-11d9-47d0-9eaa-079f1d3451c1-kube-api-access-kzr2b\") pod \"auto-csr-approver-29561986-jsp2s\" (UID: \"26493eba-11d9-47d0-9eaa-079f1d3451c1\") " pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:00 crc kubenswrapper[4735]: I0317 03:46:00.500845 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:01 crc kubenswrapper[4735]: I0317 03:46:01.017698 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561986-jsp2s"] Mar 17 03:46:01 crc kubenswrapper[4735]: I0317 03:46:01.723422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" event={"ID":"26493eba-11d9-47d0-9eaa-079f1d3451c1","Type":"ContainerStarted","Data":"24e7569d1795d9e3e7266c3c2107466164c9f766066c9f4ae3aa8dab690b93ed"} Mar 17 03:46:02 crc kubenswrapper[4735]: I0317 03:46:02.737838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" event={"ID":"26493eba-11d9-47d0-9eaa-079f1d3451c1","Type":"ContainerStarted","Data":"c44f11870b846c086e7b8a0331956b74057e7f1930707791074d110de671bb51"} Mar 17 03:46:02 crc kubenswrapper[4735]: I0317 03:46:02.777333 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" podStartSLOduration=1.496437646 podStartE2EDuration="2.777313531s" podCreationTimestamp="2026-03-17 03:46:00 +0000 UTC" firstStartedPulling="2026-03-17 03:46:01.021451313 +0000 UTC m=+9386.653684291" lastFinishedPulling="2026-03-17 03:46:02.302327188 +0000 UTC m=+9387.934560176" observedRunningTime="2026-03-17 03:46:02.766369626 +0000 UTC m=+9388.398602614" watchObservedRunningTime="2026-03-17 03:46:02.777313531 +0000 UTC m=+9388.409546519" Mar 17 03:46:03 crc kubenswrapper[4735]: I0317 03:46:03.750614 4735 generic.go:334] "Generic (PLEG): container finished" podID="26493eba-11d9-47d0-9eaa-079f1d3451c1" containerID="c44f11870b846c086e7b8a0331956b74057e7f1930707791074d110de671bb51" exitCode=0 Mar 17 03:46:03 crc kubenswrapper[4735]: I0317 03:46:03.750666 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" 
event={"ID":"26493eba-11d9-47d0-9eaa-079f1d3451c1","Type":"ContainerDied","Data":"c44f11870b846c086e7b8a0331956b74057e7f1930707791074d110de671bb51"} Mar 17 03:46:04 crc kubenswrapper[4735]: I0317 03:46:04.074154 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:46:04 crc kubenswrapper[4735]: E0317 03:46:04.074448 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.170524 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.257220 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzr2b\" (UniqueName: \"kubernetes.io/projected/26493eba-11d9-47d0-9eaa-079f1d3451c1-kube-api-access-kzr2b\") pod \"26493eba-11d9-47d0-9eaa-079f1d3451c1\" (UID: \"26493eba-11d9-47d0-9eaa-079f1d3451c1\") " Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.280067 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26493eba-11d9-47d0-9eaa-079f1d3451c1-kube-api-access-kzr2b" (OuterVolumeSpecName: "kube-api-access-kzr2b") pod "26493eba-11d9-47d0-9eaa-079f1d3451c1" (UID: "26493eba-11d9-47d0-9eaa-079f1d3451c1"). InnerVolumeSpecName "kube-api-access-kzr2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.360015 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzr2b\" (UniqueName: \"kubernetes.io/projected/26493eba-11d9-47d0-9eaa-079f1d3451c1-kube-api-access-kzr2b\") on node \"crc\" DevicePath \"\"" Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.772646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" event={"ID":"26493eba-11d9-47d0-9eaa-079f1d3451c1","Type":"ContainerDied","Data":"24e7569d1795d9e3e7266c3c2107466164c9f766066c9f4ae3aa8dab690b93ed"} Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.772685 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e7569d1795d9e3e7266c3c2107466164c9f766066c9f4ae3aa8dab690b93ed" Mar 17 03:46:05 crc kubenswrapper[4735]: I0317 03:46:05.772739 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561986-jsp2s" Mar 17 03:46:06 crc kubenswrapper[4735]: I0317 03:46:06.253389 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561980-7txfc"] Mar 17 03:46:06 crc kubenswrapper[4735]: I0317 03:46:06.262085 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561980-7txfc"] Mar 17 03:46:07 crc kubenswrapper[4735]: I0317 03:46:07.082792 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0a40f8-f770-46e2-8c7a-13a0aa615eeb" path="/var/lib/kubelet/pods/ce0a40f8-f770-46e2-8c7a-13a0aa615eeb/volumes" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.123606 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64bfk"] Mar 17 03:46:08 crc kubenswrapper[4735]: E0317 03:46:08.124974 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="26493eba-11d9-47d0-9eaa-079f1d3451c1" containerName="oc" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.125074 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="26493eba-11d9-47d0-9eaa-079f1d3451c1" containerName="oc" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.125465 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="26493eba-11d9-47d0-9eaa-079f1d3451c1" containerName="oc" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.127414 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.196684 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bfk"] Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.210041 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-catalog-content\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.210100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cns\" (UniqueName: \"kubernetes.io/projected/d73b84d3-3133-41a0-aa99-24876242b58d-kube-api-access-m7cns\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.210492 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-utilities\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " 
pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.311950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-catalog-content\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.312015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cns\" (UniqueName: \"kubernetes.io/projected/d73b84d3-3133-41a0-aa99-24876242b58d-kube-api-access-m7cns\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.312113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-utilities\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.312473 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-catalog-content\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.312496 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-utilities\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" 
Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.337515 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cns\" (UniqueName: \"kubernetes.io/projected/d73b84d3-3133-41a0-aa99-24876242b58d-kube-api-access-m7cns\") pod \"redhat-marketplace-64bfk\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.470732 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:08 crc kubenswrapper[4735]: I0317 03:46:08.524785 4735 scope.go:117] "RemoveContainer" containerID="1d47c23cb06bb74db809ddc2f2e761a4719c66cff0b40e1d3e297f799295d28b" Mar 17 03:46:09 crc kubenswrapper[4735]: I0317 03:46:09.016227 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bfk"] Mar 17 03:46:09 crc kubenswrapper[4735]: I0317 03:46:09.812165 4735 generic.go:334] "Generic (PLEG): container finished" podID="d73b84d3-3133-41a0-aa99-24876242b58d" containerID="f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e" exitCode=0 Mar 17 03:46:09 crc kubenswrapper[4735]: I0317 03:46:09.812425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerDied","Data":"f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e"} Mar 17 03:46:09 crc kubenswrapper[4735]: I0317 03:46:09.812457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerStarted","Data":"426a532d5b1e1669a72d841a91edd198a55233753f9c652c5596f98af55e68d4"} Mar 17 03:46:10 crc kubenswrapper[4735]: I0317 03:46:10.823723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerStarted","Data":"c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c"} Mar 17 03:46:11 crc kubenswrapper[4735]: I0317 03:46:11.834772 4735 generic.go:334] "Generic (PLEG): container finished" podID="d73b84d3-3133-41a0-aa99-24876242b58d" containerID="c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c" exitCode=0 Mar 17 03:46:11 crc kubenswrapper[4735]: I0317 03:46:11.834811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerDied","Data":"c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c"} Mar 17 03:46:12 crc kubenswrapper[4735]: I0317 03:46:12.846621 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerStarted","Data":"74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26"} Mar 17 03:46:12 crc kubenswrapper[4735]: I0317 03:46:12.868032 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64bfk" podStartSLOduration=2.461908898 podStartE2EDuration="4.868008957s" podCreationTimestamp="2026-03-17 03:46:08 +0000 UTC" firstStartedPulling="2026-03-17 03:46:09.816844754 +0000 UTC m=+9395.449077732" lastFinishedPulling="2026-03-17 03:46:12.222944813 +0000 UTC m=+9397.855177791" observedRunningTime="2026-03-17 03:46:12.866301366 +0000 UTC m=+9398.498534354" watchObservedRunningTime="2026-03-17 03:46:12.868008957 +0000 UTC m=+9398.500241935" Mar 17 03:46:18 crc kubenswrapper[4735]: I0317 03:46:18.073994 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:46:18 crc kubenswrapper[4735]: E0317 03:46:18.074542 4735 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:46:18 crc kubenswrapper[4735]: I0317 03:46:18.471947 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:18 crc kubenswrapper[4735]: I0317 03:46:18.472264 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:18 crc kubenswrapper[4735]: I0317 03:46:18.522179 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:18 crc kubenswrapper[4735]: I0317 03:46:18.947502 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:19 crc kubenswrapper[4735]: I0317 03:46:19.001671 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bfk"] Mar 17 03:46:20 crc kubenswrapper[4735]: I0317 03:46:20.918225 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64bfk" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="registry-server" containerID="cri-o://74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26" gracePeriod=2 Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.471529 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.527986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-utilities\") pod \"d73b84d3-3133-41a0-aa99-24876242b58d\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.528249 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-catalog-content\") pod \"d73b84d3-3133-41a0-aa99-24876242b58d\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.528341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7cns\" (UniqueName: \"kubernetes.io/projected/d73b84d3-3133-41a0-aa99-24876242b58d-kube-api-access-m7cns\") pod \"d73b84d3-3133-41a0-aa99-24876242b58d\" (UID: \"d73b84d3-3133-41a0-aa99-24876242b58d\") " Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.529440 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-utilities" (OuterVolumeSpecName: "utilities") pod "d73b84d3-3133-41a0-aa99-24876242b58d" (UID: "d73b84d3-3133-41a0-aa99-24876242b58d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.535702 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73b84d3-3133-41a0-aa99-24876242b58d-kube-api-access-m7cns" (OuterVolumeSpecName: "kube-api-access-m7cns") pod "d73b84d3-3133-41a0-aa99-24876242b58d" (UID: "d73b84d3-3133-41a0-aa99-24876242b58d"). InnerVolumeSpecName "kube-api-access-m7cns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.560954 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d73b84d3-3133-41a0-aa99-24876242b58d" (UID: "d73b84d3-3133-41a0-aa99-24876242b58d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.630931 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.630971 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7cns\" (UniqueName: \"kubernetes.io/projected/d73b84d3-3133-41a0-aa99-24876242b58d-kube-api-access-m7cns\") on node \"crc\" DevicePath \"\"" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.630987 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d73b84d3-3133-41a0-aa99-24876242b58d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.932365 4735 generic.go:334] "Generic (PLEG): container finished" podID="d73b84d3-3133-41a0-aa99-24876242b58d" containerID="74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26" exitCode=0 Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.932430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerDied","Data":"74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26"} Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.932474 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-64bfk" event={"ID":"d73b84d3-3133-41a0-aa99-24876242b58d","Type":"ContainerDied","Data":"426a532d5b1e1669a72d841a91edd198a55233753f9c652c5596f98af55e68d4"} Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.932502 4735 scope.go:117] "RemoveContainer" containerID="74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.932696 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bfk" Mar 17 03:46:21 crc kubenswrapper[4735]: I0317 03:46:21.966186 4735 scope.go:117] "RemoveContainer" containerID="c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.013756 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bfk"] Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.014377 4735 scope.go:117] "RemoveContainer" containerID="f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.026989 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bfk"] Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.061377 4735 scope.go:117] "RemoveContainer" containerID="74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26" Mar 17 03:46:22 crc kubenswrapper[4735]: E0317 03:46:22.061824 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26\": container with ID starting with 74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26 not found: ID does not exist" containerID="74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.061878 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26"} err="failed to get container status \"74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26\": rpc error: code = NotFound desc = could not find container \"74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26\": container with ID starting with 74691a7a5b5a70ed4625460a26967427780b66353871e4ab6f2d2ecb6d4cfb26 not found: ID does not exist" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.061906 4735 scope.go:117] "RemoveContainer" containerID="c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c" Mar 17 03:46:22 crc kubenswrapper[4735]: E0317 03:46:22.062170 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c\": container with ID starting with c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c not found: ID does not exist" containerID="c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.062193 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c"} err="failed to get container status \"c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c\": rpc error: code = NotFound desc = could not find container \"c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c\": container with ID starting with c1e18eab613d99351cea448e8e1f7ed4240853b9adc59f577ea75952cf12466c not found: ID does not exist" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.062224 4735 scope.go:117] "RemoveContainer" containerID="f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e" Mar 17 03:46:22 crc kubenswrapper[4735]: E0317 
03:46:22.062498 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e\": container with ID starting with f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e not found: ID does not exist" containerID="f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e" Mar 17 03:46:22 crc kubenswrapper[4735]: I0317 03:46:22.062518 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e"} err="failed to get container status \"f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e\": rpc error: code = NotFound desc = could not find container \"f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e\": container with ID starting with f5f5394b18dc9c9c1a61fec291088171f106196642fd22fcb62fef9a27a4c88e not found: ID does not exist" Mar 17 03:46:23 crc kubenswrapper[4735]: I0317 03:46:23.085708 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" path="/var/lib/kubelet/pods/d73b84d3-3133-41a0-aa99-24876242b58d/volumes" Mar 17 03:46:30 crc kubenswrapper[4735]: I0317 03:46:30.073367 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:46:30 crc kubenswrapper[4735]: E0317 03:46:30.074209 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:46:44 crc kubenswrapper[4735]: I0317 03:46:44.074071 
4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:46:44 crc kubenswrapper[4735]: E0317 03:46:44.074883 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:46:57 crc kubenswrapper[4735]: I0317 03:46:57.073113 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:46:57 crc kubenswrapper[4735]: E0317 03:46:57.074088 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.073589 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:47:11 crc kubenswrapper[4735]: E0317 03:47:11.074430 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 
03:47:11.667090 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7h8qd"] Mar 17 03:47:11 crc kubenswrapper[4735]: E0317 03:47:11.667721 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="extract-utilities" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.667754 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="extract-utilities" Mar 17 03:47:11 crc kubenswrapper[4735]: E0317 03:47:11.667786 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="registry-server" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.667796 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="registry-server" Mar 17 03:47:11 crc kubenswrapper[4735]: E0317 03:47:11.667830 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="extract-content" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.667840 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="extract-content" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.669329 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73b84d3-3133-41a0-aa99-24876242b58d" containerName="registry-server" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.671255 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.684361 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7h8qd"] Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.841704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-utilities\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.841917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-catalog-content\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.842008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsr89\" (UniqueName: \"kubernetes.io/projected/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-kube-api-access-nsr89\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.943165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-utilities\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.943272 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-catalog-content\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.943316 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsr89\" (UniqueName: \"kubernetes.io/projected/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-kube-api-access-nsr89\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.944258 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-utilities\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.944465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-catalog-content\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.965171 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsr89\" (UniqueName: \"kubernetes.io/projected/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-kube-api-access-nsr89\") pod \"certified-operators-7h8qd\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:11 crc kubenswrapper[4735]: I0317 03:47:11.994872 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:12 crc kubenswrapper[4735]: I0317 03:47:12.507804 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7h8qd"] Mar 17 03:47:13 crc kubenswrapper[4735]: I0317 03:47:13.496132 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerID="7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93" exitCode=0 Mar 17 03:47:13 crc kubenswrapper[4735]: I0317 03:47:13.496214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerDied","Data":"7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93"} Mar 17 03:47:13 crc kubenswrapper[4735]: I0317 03:47:13.496496 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerStarted","Data":"7581b8fd12821401fc7d9c35b8ad03bf900693f22407f583054dbfc26c960f08"} Mar 17 03:47:14 crc kubenswrapper[4735]: I0317 03:47:14.506633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerStarted","Data":"f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0"} Mar 17 03:47:17 crc kubenswrapper[4735]: I0317 03:47:17.611945 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerID="f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0" exitCode=0 Mar 17 03:47:17 crc kubenswrapper[4735]: I0317 03:47:17.612022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" 
event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerDied","Data":"f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0"} Mar 17 03:47:18 crc kubenswrapper[4735]: I0317 03:47:18.627391 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerStarted","Data":"d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0"} Mar 17 03:47:18 crc kubenswrapper[4735]: I0317 03:47:18.660774 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7h8qd" podStartSLOduration=3.123888594 podStartE2EDuration="7.660752145s" podCreationTimestamp="2026-03-17 03:47:11 +0000 UTC" firstStartedPulling="2026-03-17 03:47:13.498372773 +0000 UTC m=+9459.130605751" lastFinishedPulling="2026-03-17 03:47:18.035236324 +0000 UTC m=+9463.667469302" observedRunningTime="2026-03-17 03:47:18.656791679 +0000 UTC m=+9464.289024667" watchObservedRunningTime="2026-03-17 03:47:18.660752145 +0000 UTC m=+9464.292985133" Mar 17 03:47:21 crc kubenswrapper[4735]: I0317 03:47:21.996166 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:21 crc kubenswrapper[4735]: I0317 03:47:21.996477 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:23 crc kubenswrapper[4735]: I0317 03:47:23.051948 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7h8qd" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="registry-server" probeResult="failure" output=< Mar 17 03:47:23 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:47:23 crc kubenswrapper[4735]: > Mar 17 03:47:23 crc kubenswrapper[4735]: I0317 03:47:23.073582 4735 scope.go:117] 
"RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:47:23 crc kubenswrapper[4735]: E0317 03:47:23.073945 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:47:32 crc kubenswrapper[4735]: I0317 03:47:32.060104 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:32 crc kubenswrapper[4735]: I0317 03:47:32.121335 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:32 crc kubenswrapper[4735]: I0317 03:47:32.322194 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7h8qd"] Mar 17 03:47:33 crc kubenswrapper[4735]: I0317 03:47:33.810441 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7h8qd" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="registry-server" containerID="cri-o://d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0" gracePeriod=2 Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.522809 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7h8qd" Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.623825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-utilities\") pod \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.624062 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-catalog-content\") pod \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.624806 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-utilities" (OuterVolumeSpecName: "utilities") pod "c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" (UID: "c4ec6f2d-e03f-4b32-9e2e-919ea6de180a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.667311 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" (UID: "c4ec6f2d-e03f-4b32-9e2e-919ea6de180a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.725763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsr89\" (UniqueName: \"kubernetes.io/projected/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-kube-api-access-nsr89\") pod \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\" (UID: \"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a\") " Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.726290 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.726313 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.741156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-kube-api-access-nsr89" (OuterVolumeSpecName: "kube-api-access-nsr89") pod "c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" (UID: "c4ec6f2d-e03f-4b32-9e2e-919ea6de180a"). InnerVolumeSpecName "kube-api-access-nsr89". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.821340 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerID="d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0" exitCode=0
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.821402 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerDied","Data":"d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0"}
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.821433 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7h8qd" event={"ID":"c4ec6f2d-e03f-4b32-9e2e-919ea6de180a","Type":"ContainerDied","Data":"7581b8fd12821401fc7d9c35b8ad03bf900693f22407f583054dbfc26c960f08"}
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.821453 4735 scope.go:117] "RemoveContainer" containerID="d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.821603 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7h8qd"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.828787 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsr89\" (UniqueName: \"kubernetes.io/projected/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a-kube-api-access-nsr89\") on node \"crc\" DevicePath \"\""
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.847071 4735 scope.go:117] "RemoveContainer" containerID="f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.903232 4735 scope.go:117] "RemoveContainer" containerID="7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.913113 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7h8qd"]
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.923117 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7h8qd"]
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.944384 4735 scope.go:117] "RemoveContainer" containerID="d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0"
Mar 17 03:47:34 crc kubenswrapper[4735]: E0317 03:47:34.944757 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0\": container with ID starting with d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0 not found: ID does not exist" containerID="d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.944798 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0"} err="failed to get container status \"d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0\": rpc error: code = NotFound desc = could not find container \"d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0\": container with ID starting with d77563375281225358f857beb459335bf9f702742fe485a22fcf29266a89f3e0 not found: ID does not exist"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.944827 4735 scope.go:117] "RemoveContainer" containerID="f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0"
Mar 17 03:47:34 crc kubenswrapper[4735]: E0317 03:47:34.945238 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0\": container with ID starting with f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0 not found: ID does not exist" containerID="f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.945258 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0"} err="failed to get container status \"f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0\": rpc error: code = NotFound desc = could not find container \"f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0\": container with ID starting with f27934b24a7915622251169d4af62ba6fa85d5cc486b042aeb86dcb75483b7c0 not found: ID does not exist"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.945273 4735 scope.go:117] "RemoveContainer" containerID="7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93"
Mar 17 03:47:34 crc kubenswrapper[4735]: E0317 03:47:34.945441 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93\": container with ID starting with 7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93 not found: ID does not exist" containerID="7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93"
Mar 17 03:47:34 crc kubenswrapper[4735]: I0317 03:47:34.945464 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93"} err="failed to get container status \"7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93\": rpc error: code = NotFound desc = could not find container \"7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93\": container with ID starting with 7cd2bb34b09592d786445cb2bd765d6471bd3099d16ac16709ef24de828a4d93 not found: ID does not exist"
Mar 17 03:47:35 crc kubenswrapper[4735]: I0317 03:47:35.083257 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" path="/var/lib/kubelet/pods/c4ec6f2d-e03f-4b32-9e2e-919ea6de180a/volumes"
Mar 17 03:47:37 crc kubenswrapper[4735]: I0317 03:47:37.075512 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:47:37 crc kubenswrapper[4735]: E0317 03:47:37.076176 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:47:52 crc kubenswrapper[4735]: I0317 03:47:52.072958 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:47:52 crc kubenswrapper[4735]: E0317 03:47:52.073684 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.178597 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561988-dxklv"]
Mar 17 03:48:00 crc kubenswrapper[4735]: E0317 03:48:00.179797 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="extract-content"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.179819 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="extract-content"
Mar 17 03:48:00 crc kubenswrapper[4735]: E0317 03:48:00.179895 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="registry-server"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.179910 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="registry-server"
Mar 17 03:48:00 crc kubenswrapper[4735]: E0317 03:48:00.179948 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="extract-utilities"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.179960 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="extract-utilities"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.180252 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ec6f2d-e03f-4b32-9e2e-919ea6de180a" containerName="registry-server"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.181048 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.184147 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.188482 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.189361 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.196169 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561988-dxklv"]
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.299068 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcxjm\" (UniqueName: \"kubernetes.io/projected/0c7c2d5f-ea6e-4d98-bacc-000724b141b7-kube-api-access-gcxjm\") pod \"auto-csr-approver-29561988-dxklv\" (UID: \"0c7c2d5f-ea6e-4d98-bacc-000724b141b7\") " pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.401931 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcxjm\" (UniqueName: \"kubernetes.io/projected/0c7c2d5f-ea6e-4d98-bacc-000724b141b7-kube-api-access-gcxjm\") pod \"auto-csr-approver-29561988-dxklv\" (UID: \"0c7c2d5f-ea6e-4d98-bacc-000724b141b7\") " pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.423424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcxjm\" (UniqueName: \"kubernetes.io/projected/0c7c2d5f-ea6e-4d98-bacc-000724b141b7-kube-api-access-gcxjm\") pod \"auto-csr-approver-29561988-dxklv\" (UID: \"0c7c2d5f-ea6e-4d98-bacc-000724b141b7\") " pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:00 crc kubenswrapper[4735]: I0317 03:48:00.513807 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:01 crc kubenswrapper[4735]: I0317 03:48:01.090265 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561988-dxklv"]
Mar 17 03:48:01 crc kubenswrapper[4735]: W0317 03:48:01.099904 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c7c2d5f_ea6e_4d98_bacc_000724b141b7.slice/crio-bec3514ce7e990be7e046259be48615a1ed47087ef98a1e4b910b6cf743b5918 WatchSource:0}: Error finding container bec3514ce7e990be7e046259be48615a1ed47087ef98a1e4b910b6cf743b5918: Status 404 returned error can't find the container with id bec3514ce7e990be7e046259be48615a1ed47087ef98a1e4b910b6cf743b5918
Mar 17 03:48:01 crc kubenswrapper[4735]: I0317 03:48:01.121348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561988-dxklv" event={"ID":"0c7c2d5f-ea6e-4d98-bacc-000724b141b7","Type":"ContainerStarted","Data":"bec3514ce7e990be7e046259be48615a1ed47087ef98a1e4b910b6cf743b5918"}
Mar 17 03:48:03 crc kubenswrapper[4735]: I0317 03:48:03.175773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561988-dxklv" event={"ID":"0c7c2d5f-ea6e-4d98-bacc-000724b141b7","Type":"ContainerStarted","Data":"a329352668f63fd09723f7cfd5e72a65057333597253dde8949797362c8e8ff9"}
Mar 17 03:48:03 crc kubenswrapper[4735]: I0317 03:48:03.206342 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561988-dxklv" podStartSLOduration=2.107476512 podStartE2EDuration="3.206323508s" podCreationTimestamp="2026-03-17 03:48:00 +0000 UTC" firstStartedPulling="2026-03-17 03:48:01.103069811 +0000 UTC m=+9506.735302789" lastFinishedPulling="2026-03-17 03:48:02.201916797 +0000 UTC m=+9507.834149785" observedRunningTime="2026-03-17 03:48:03.195339103 +0000 UTC m=+9508.827572081" watchObservedRunningTime="2026-03-17 03:48:03.206323508 +0000 UTC m=+9508.838556486"
Mar 17 03:48:04 crc kubenswrapper[4735]: I0317 03:48:04.188876 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c7c2d5f-ea6e-4d98-bacc-000724b141b7" containerID="a329352668f63fd09723f7cfd5e72a65057333597253dde8949797362c8e8ff9" exitCode=0
Mar 17 03:48:04 crc kubenswrapper[4735]: I0317 03:48:04.188925 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561988-dxklv" event={"ID":"0c7c2d5f-ea6e-4d98-bacc-000724b141b7","Type":"ContainerDied","Data":"a329352668f63fd09723f7cfd5e72a65057333597253dde8949797362c8e8ff9"}
Mar 17 03:48:05 crc kubenswrapper[4735]: I0317 03:48:05.083120 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:48:05 crc kubenswrapper[4735]: E0317 03:48:05.084199 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:48:05 crc kubenswrapper[4735]: I0317 03:48:05.580241 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:05 crc kubenswrapper[4735]: I0317 03:48:05.723354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcxjm\" (UniqueName: \"kubernetes.io/projected/0c7c2d5f-ea6e-4d98-bacc-000724b141b7-kube-api-access-gcxjm\") pod \"0c7c2d5f-ea6e-4d98-bacc-000724b141b7\" (UID: \"0c7c2d5f-ea6e-4d98-bacc-000724b141b7\") "
Mar 17 03:48:05 crc kubenswrapper[4735]: I0317 03:48:05.729415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7c2d5f-ea6e-4d98-bacc-000724b141b7-kube-api-access-gcxjm" (OuterVolumeSpecName: "kube-api-access-gcxjm") pod "0c7c2d5f-ea6e-4d98-bacc-000724b141b7" (UID: "0c7c2d5f-ea6e-4d98-bacc-000724b141b7"). InnerVolumeSpecName "kube-api-access-gcxjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 03:48:05 crc kubenswrapper[4735]: I0317 03:48:05.825700 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcxjm\" (UniqueName: \"kubernetes.io/projected/0c7c2d5f-ea6e-4d98-bacc-000724b141b7-kube-api-access-gcxjm\") on node \"crc\" DevicePath \"\""
Mar 17 03:48:06 crc kubenswrapper[4735]: I0317 03:48:06.214453 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561988-dxklv" event={"ID":"0c7c2d5f-ea6e-4d98-bacc-000724b141b7","Type":"ContainerDied","Data":"bec3514ce7e990be7e046259be48615a1ed47087ef98a1e4b910b6cf743b5918"}
Mar 17 03:48:06 crc kubenswrapper[4735]: I0317 03:48:06.214510 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec3514ce7e990be7e046259be48615a1ed47087ef98a1e4b910b6cf743b5918"
Mar 17 03:48:06 crc kubenswrapper[4735]: I0317 03:48:06.214587 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561988-dxklv"
Mar 17 03:48:06 crc kubenswrapper[4735]: I0317 03:48:06.273985 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561982-nrdtt"]
Mar 17 03:48:06 crc kubenswrapper[4735]: I0317 03:48:06.280965 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561982-nrdtt"]
Mar 17 03:48:07 crc kubenswrapper[4735]: I0317 03:48:07.086838 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75fba438-839c-408f-904e-5d7fc8c24a36" path="/var/lib/kubelet/pods/75fba438-839c-408f-904e-5d7fc8c24a36/volumes"
Mar 17 03:48:08 crc kubenswrapper[4735]: I0317 03:48:08.719341 4735 scope.go:117] "RemoveContainer" containerID="62fe57647da1635ee9153850c5a40c50f18b7d9d480965b0c222281fecd9b38a"
Mar 17 03:48:20 crc kubenswrapper[4735]: I0317 03:48:20.073712 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:48:20 crc kubenswrapper[4735]: E0317 03:48:20.074752 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:48:35 crc kubenswrapper[4735]: I0317 03:48:35.083175 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:48:35 crc kubenswrapper[4735]: E0317 03:48:35.085186 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:48:50 crc kubenswrapper[4735]: I0317 03:48:50.073046 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:48:50 crc kubenswrapper[4735]: E0317 03:48:50.073659 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:49:01 crc kubenswrapper[4735]: I0317 03:49:01.072799 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:49:01 crc kubenswrapper[4735]: E0317 03:49:01.074524 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:49:15 crc kubenswrapper[4735]: I0317 03:49:15.085248 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:49:15 crc kubenswrapper[4735]: E0317 03:49:15.086298 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.084563 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ddf9f84d9-bh5g9"]
Mar 17 03:49:27 crc kubenswrapper[4735]: E0317 03:49:27.085430 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7c2d5f-ea6e-4d98-bacc-000724b141b7" containerName="oc"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.085447 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7c2d5f-ea6e-4d98-bacc-000724b141b7" containerName="oc"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.085696 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7c2d5f-ea6e-4d98-bacc-000724b141b7" containerName="oc"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.087011 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.121426 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ddf9f84d9-bh5g9"]
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.177393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-httpd-config\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.177445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-combined-ca-bundle\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.177495 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-config\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.177803 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-internal-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.177876 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnz64\" (UniqueName: \"kubernetes.io/projected/c6f23de1-8a7f-4f5f-986a-df17ee65d752-kube-api-access-bnz64\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.178046 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-ovndb-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.178247 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-public-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-ovndb-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281255 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-public-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281343 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-httpd-config\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281392 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-combined-ca-bundle\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-config\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281569 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-internal-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.281611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnz64\" (UniqueName: \"kubernetes.io/projected/c6f23de1-8a7f-4f5f-986a-df17ee65d752-kube-api-access-bnz64\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.287598 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-ovndb-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.288258 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-config\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.288408 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-httpd-config\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.289895 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-public-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.296800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-combined-ca-bundle\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.298087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6f23de1-8a7f-4f5f-986a-df17ee65d752-internal-tls-certs\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.309459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnz64\" (UniqueName: \"kubernetes.io/projected/c6f23de1-8a7f-4f5f-986a-df17ee65d752-kube-api-access-bnz64\") pod \"neutron-ddf9f84d9-bh5g9\" (UID: \"c6f23de1-8a7f-4f5f-986a-df17ee65d752\") " pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:27 crc kubenswrapper[4735]: I0317 03:49:27.406496 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:28 crc kubenswrapper[4735]: I0317 03:49:28.073621 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:49:28 crc kubenswrapper[4735]: E0317 03:49:28.074189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:49:28 crc kubenswrapper[4735]: I0317 03:49:28.499840 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ddf9f84d9-bh5g9"]
Mar 17 03:49:29 crc kubenswrapper[4735]: I0317 03:49:29.087209 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf9f84d9-bh5g9" event={"ID":"c6f23de1-8a7f-4f5f-986a-df17ee65d752","Type":"ContainerStarted","Data":"26483269432f59c056d5a75fb43ffd417d338b1bca035ac21a4d21fe4740fab9"}
Mar 17 03:49:29 crc kubenswrapper[4735]: I0317 03:49:29.088366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf9f84d9-bh5g9" event={"ID":"c6f23de1-8a7f-4f5f-986a-df17ee65d752","Type":"ContainerStarted","Data":"6b13186daacd4bc21abce22d3c70a5cb0e740028000825f936063f0ffde6db6c"}
Mar 17 03:49:29 crc kubenswrapper[4735]: I0317 03:49:29.088380 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf9f84d9-bh5g9" event={"ID":"c6f23de1-8a7f-4f5f-986a-df17ee65d752","Type":"ContainerStarted","Data":"d1d4467b5732d38035b11d48289dfb3a78c98b413d788a547cb25c4ca59263d9"}
Mar 17 03:49:29 crc kubenswrapper[4735]: I0317 03:49:29.088393 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:29 crc kubenswrapper[4735]: I0317 03:49:29.117427 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ddf9f84d9-bh5g9" podStartSLOduration=2.117408996 podStartE2EDuration="2.117408996s" podCreationTimestamp="2026-03-17 03:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 03:49:29.109980897 +0000 UTC m=+9594.742213875" watchObservedRunningTime="2026-03-17 03:49:29.117408996 +0000 UTC m=+9594.749641974"
Mar 17 03:49:40 crc kubenswrapper[4735]: I0317 03:49:40.074005 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:49:40 crc kubenswrapper[4735]: E0317 03:49:40.074840 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:49:51 crc kubenswrapper[4735]: I0317 03:49:51.073564 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221"
Mar 17 03:49:51 crc kubenswrapper[4735]: E0317 03:49:51.074372 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 03:49:57 crc kubenswrapper[4735]: I0317 03:49:57.423040 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-ddf9f84d9-bh5g9"
Mar 17 03:49:57 crc kubenswrapper[4735]: I0317 03:49:57.537597 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d9c87887c-nzq2p"]
Mar 17 03:49:57 crc kubenswrapper[4735]: I0317 03:49:57.537880 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d9c87887c-nzq2p" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-api" containerID="cri-o://d8ad7c5e0c4a91a6804e23bf06a1cf430c578a70e39fa2d6b7363239a4740ea6" gracePeriod=30
Mar 17 03:49:57 crc kubenswrapper[4735]: I0317 03:49:57.538326 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d9c87887c-nzq2p" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-httpd" containerID="cri-o://f22530e71aeaeea7659cae58e7701b91cb653ec4667b28a75942ea9e75216442" gracePeriod=30
Mar 17 03:49:58 crc kubenswrapper[4735]: I0317 03:49:58.359426 4735 generic.go:334] "Generic (PLEG): container finished" podID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerID="f22530e71aeaeea7659cae58e7701b91cb653ec4667b28a75942ea9e75216442" exitCode=0
Mar 17 03:49:58 crc kubenswrapper[4735]: I0317 03:49:58.359721 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9c87887c-nzq2p" event={"ID":"85ecf5a4-2bd8-444d-9008-c2cec6d57362","Type":"ContainerDied","Data":"f22530e71aeaeea7659cae58e7701b91cb653ec4667b28a75942ea9e75216442"}
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.170885 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561990-dq5bb"]
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.173612 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561990-dq5bb"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.177745 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.178353 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.189331 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.193369 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561990-dq5bb"]
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.299392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dwmn\" (UniqueName: \"kubernetes.io/projected/b5f03e13-4ca9-432d-b6dc-cb707d75556b-kube-api-access-9dwmn\") pod \"auto-csr-approver-29561990-dq5bb\" (UID: \"b5f03e13-4ca9-432d-b6dc-cb707d75556b\") " pod="openshift-infra/auto-csr-approver-29561990-dq5bb"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.401568 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dwmn\" (UniqueName: \"kubernetes.io/projected/b5f03e13-4ca9-432d-b6dc-cb707d75556b-kube-api-access-9dwmn\") pod \"auto-csr-approver-29561990-dq5bb\" (UID: \"b5f03e13-4ca9-432d-b6dc-cb707d75556b\") " pod="openshift-infra/auto-csr-approver-29561990-dq5bb"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.424843 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dwmn\" (UniqueName: \"kubernetes.io/projected/b5f03e13-4ca9-432d-b6dc-cb707d75556b-kube-api-access-9dwmn\") pod \"auto-csr-approver-29561990-dq5bb\" (UID: \"b5f03e13-4ca9-432d-b6dc-cb707d75556b\") " pod="openshift-infra/auto-csr-approver-29561990-dq5bb"
Mar 17 03:50:00 crc kubenswrapper[4735]: I0317 03:50:00.504212 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561990-dq5bb"
Mar 17 03:50:01 crc kubenswrapper[4735]: I0317 03:50:01.365889 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561990-dq5bb"]
Mar 17 03:50:01 crc kubenswrapper[4735]: I0317 03:50:01.390338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" event={"ID":"b5f03e13-4ca9-432d-b6dc-cb707d75556b","Type":"ContainerStarted","Data":"a1db169e8af81c2e0fd02e6d20f295f0fe180ac438621080e1e1c60221ed903d"}
Mar 17 03:50:01 crc kubenswrapper[4735]: I0317 03:50:01.396335 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 03:50:02 crc kubenswrapper[4735]: I0317 03:50:02.757986 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d9c87887c-nzq2p" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.71:9696/\": dial tcp 10.217.1.71:9696: connect: connection refused"
Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.413844 4735 generic.go:334] "Generic (PLEG): container finished" podID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerID="d8ad7c5e0c4a91a6804e23bf06a1cf430c578a70e39fa2d6b7363239a4740ea6" exitCode=0
Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.413928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9c87887c-nzq2p"
event={"ID":"85ecf5a4-2bd8-444d-9008-c2cec6d57362","Type":"ContainerDied","Data":"d8ad7c5e0c4a91a6804e23bf06a1cf430c578a70e39fa2d6b7363239a4740ea6"} Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.415634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" event={"ID":"b5f03e13-4ca9-432d-b6dc-cb707d75556b","Type":"ContainerStarted","Data":"a03488f2cc78d31ac531e9c8807c161559420f999a3b72d770879718c956544d"} Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.432671 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" podStartSLOduration=2.199299654 podStartE2EDuration="3.432651362s" podCreationTimestamp="2026-03-17 03:50:00 +0000 UTC" firstStartedPulling="2026-03-17 03:50:01.384381565 +0000 UTC m=+9627.016614543" lastFinishedPulling="2026-03-17 03:50:02.617733263 +0000 UTC m=+9628.249966251" observedRunningTime="2026-03-17 03:50:03.4325447 +0000 UTC m=+9629.064777678" watchObservedRunningTime="2026-03-17 03:50:03.432651362 +0000 UTC m=+9629.064884350" Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.742723 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.904928 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-ovndb-tls-certs\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.905001 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-httpd-config\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.905187 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-config\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.905252 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-combined-ca-bundle\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.905344 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-public-tls-certs\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.905376 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t29ph\" (UniqueName: 
\"kubernetes.io/projected/85ecf5a4-2bd8-444d-9008-c2cec6d57362-kube-api-access-t29ph\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.905417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.914260 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.916137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ecf5a4-2bd8-444d-9008-c2cec6d57362-kube-api-access-t29ph" (OuterVolumeSpecName: "kube-api-access-t29ph") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "kube-api-access-t29ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.979554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-config" (OuterVolumeSpecName: "config") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:03 crc kubenswrapper[4735]: I0317 03:50:03.981788 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.001851 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.006162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.008541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs\") pod \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\" (UID: \"85ecf5a4-2bd8-444d-9008-c2cec6d57362\") " Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.009159 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.009672 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-config\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.009784 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.009905 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.010001 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t29ph\" (UniqueName: \"kubernetes.io/projected/85ecf5a4-2bd8-444d-9008-c2cec6d57362-kube-api-access-t29ph\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: W0317 03:50:04.011105 4735 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/85ecf5a4-2bd8-444d-9008-c2cec6d57362/volumes/kubernetes.io~secret/internal-tls-certs Mar 17 
03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.011148 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.072873 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:50:04 crc kubenswrapper[4735]: E0317 03:50:04.073449 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.078154 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "85ecf5a4-2bd8-444d-9008-c2cec6d57362" (UID: "85ecf5a4-2bd8-444d-9008-c2cec6d57362"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.111500 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.111528 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ecf5a4-2bd8-444d-9008-c2cec6d57362-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.434893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9c87887c-nzq2p" event={"ID":"85ecf5a4-2bd8-444d-9008-c2cec6d57362","Type":"ContainerDied","Data":"e0be22e99164a5922fc5f78a685d2f4c4a0922a2d0e1622e7b2d1938b2c5ee80"} Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.434917 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d9c87887c-nzq2p" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.434944 4735 scope.go:117] "RemoveContainer" containerID="f22530e71aeaeea7659cae58e7701b91cb653ec4667b28a75942ea9e75216442" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.446313 4735 generic.go:334] "Generic (PLEG): container finished" podID="b5f03e13-4ca9-432d-b6dc-cb707d75556b" containerID="a03488f2cc78d31ac531e9c8807c161559420f999a3b72d770879718c956544d" exitCode=0 Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.446354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" event={"ID":"b5f03e13-4ca9-432d-b6dc-cb707d75556b","Type":"ContainerDied","Data":"a03488f2cc78d31ac531e9c8807c161559420f999a3b72d770879718c956544d"} Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.487004 4735 scope.go:117] "RemoveContainer" containerID="d8ad7c5e0c4a91a6804e23bf06a1cf430c578a70e39fa2d6b7363239a4740ea6" Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.493075 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d9c87887c-nzq2p"] Mar 17 03:50:04 crc kubenswrapper[4735]: I0317 03:50:04.502402 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d9c87887c-nzq2p"] Mar 17 03:50:04 crc kubenswrapper[4735]: E0317 03:50:04.520068 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f03e13_4ca9_432d_b6dc_cb707d75556b.slice/crio-conmon-a03488f2cc78d31ac531e9c8807c161559420f999a3b72d770879718c956544d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f03e13_4ca9_432d_b6dc_cb707d75556b.slice/crio-a03488f2cc78d31ac531e9c8807c161559420f999a3b72d770879718c956544d.scope\": RecentStats: unable to find data in memory cache]" Mar 17 03:50:05 
crc kubenswrapper[4735]: I0317 03:50:05.094493 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" path="/var/lib/kubelet/pods/85ecf5a4-2bd8-444d-9008-c2cec6d57362/volumes" Mar 17 03:50:05 crc kubenswrapper[4735]: I0317 03:50:05.869941 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.049805 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dwmn\" (UniqueName: \"kubernetes.io/projected/b5f03e13-4ca9-432d-b6dc-cb707d75556b-kube-api-access-9dwmn\") pod \"b5f03e13-4ca9-432d-b6dc-cb707d75556b\" (UID: \"b5f03e13-4ca9-432d-b6dc-cb707d75556b\") " Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.061148 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f03e13-4ca9-432d-b6dc-cb707d75556b-kube-api-access-9dwmn" (OuterVolumeSpecName: "kube-api-access-9dwmn") pod "b5f03e13-4ca9-432d-b6dc-cb707d75556b" (UID: "b5f03e13-4ca9-432d-b6dc-cb707d75556b"). InnerVolumeSpecName "kube-api-access-9dwmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.151445 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dwmn\" (UniqueName: \"kubernetes.io/projected/b5f03e13-4ca9-432d-b6dc-cb707d75556b-kube-api-access-9dwmn\") on node \"crc\" DevicePath \"\"" Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.493446 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" event={"ID":"b5f03e13-4ca9-432d-b6dc-cb707d75556b","Type":"ContainerDied","Data":"a1db169e8af81c2e0fd02e6d20f295f0fe180ac438621080e1e1c60221ed903d"} Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.493512 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1db169e8af81c2e0fd02e6d20f295f0fe180ac438621080e1e1c60221ed903d" Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.493628 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561990-dq5bb" Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.538742 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561984-8rqhm"] Mar 17 03:50:06 crc kubenswrapper[4735]: I0317 03:50:06.550561 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561984-8rqhm"] Mar 17 03:50:07 crc kubenswrapper[4735]: I0317 03:50:07.087126 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713eefe9-a95c-4145-8b76-a6f05c7ff4f0" path="/var/lib/kubelet/pods/713eefe9-a95c-4145-8b76-a6f05c7ff4f0/volumes" Mar 17 03:50:08 crc kubenswrapper[4735]: I0317 03:50:08.862191 4735 scope.go:117] "RemoveContainer" containerID="b7771e7521699c4f1384364a227b1605bd78b86ef4280a8d139b67395fb2154d" Mar 17 03:50:15 crc kubenswrapper[4735]: I0317 03:50:15.078801 4735 scope.go:117] "RemoveContainer" 
containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:50:15 crc kubenswrapper[4735]: I0317 03:50:15.583083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"fec9e2e24648c9533863dfbc1a681fa712a911fd320d183d7d4a5ff812b9d553"} Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.190692 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561992-pppsp"] Mar 17 03:52:00 crc kubenswrapper[4735]: E0317 03:52:00.193533 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-api" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.193573 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-api" Mar 17 03:52:00 crc kubenswrapper[4735]: E0317 03:52:00.193648 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-httpd" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.193662 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-httpd" Mar 17 03:52:00 crc kubenswrapper[4735]: E0317 03:52:00.193688 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f03e13-4ca9-432d-b6dc-cb707d75556b" containerName="oc" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.193702 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f03e13-4ca9-432d-b6dc-cb707d75556b" containerName="oc" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.194262 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f03e13-4ca9-432d-b6dc-cb707d75556b" containerName="oc" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.194679 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-httpd" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.194714 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ecf5a4-2bd8-444d-9008-c2cec6d57362" containerName="neutron-api" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.198498 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.201147 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.205507 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561992-pppsp"] Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.209792 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.210317 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.288182 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltxdm\" (UniqueName: \"kubernetes.io/projected/3263becc-fa75-443e-a544-9b021916eaf5-kube-api-access-ltxdm\") pod \"auto-csr-approver-29561992-pppsp\" (UID: \"3263becc-fa75-443e-a544-9b021916eaf5\") " pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.390185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltxdm\" (UniqueName: \"kubernetes.io/projected/3263becc-fa75-443e-a544-9b021916eaf5-kube-api-access-ltxdm\") pod \"auto-csr-approver-29561992-pppsp\" (UID: 
\"3263becc-fa75-443e-a544-9b021916eaf5\") " pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.423152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltxdm\" (UniqueName: \"kubernetes.io/projected/3263becc-fa75-443e-a544-9b021916eaf5-kube-api-access-ltxdm\") pod \"auto-csr-approver-29561992-pppsp\" (UID: \"3263becc-fa75-443e-a544-9b021916eaf5\") " pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:00 crc kubenswrapper[4735]: I0317 03:52:00.529870 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:01 crc kubenswrapper[4735]: I0317 03:52:01.649568 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561992-pppsp"] Mar 17 03:52:01 crc kubenswrapper[4735]: W0317 03:52:01.692657 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3263becc_fa75_443e_a544_9b021916eaf5.slice/crio-b17937c3287ac5b8801dede6996d810dd3010a1edbbe4e0cff29085406882380 WatchSource:0}: Error finding container b17937c3287ac5b8801dede6996d810dd3010a1edbbe4e0cff29085406882380: Status 404 returned error can't find the container with id b17937c3287ac5b8801dede6996d810dd3010a1edbbe4e0cff29085406882380 Mar 17 03:52:02 crc kubenswrapper[4735]: I0317 03:52:02.720492 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561992-pppsp" event={"ID":"3263becc-fa75-443e-a544-9b021916eaf5","Type":"ContainerStarted","Data":"b17937c3287ac5b8801dede6996d810dd3010a1edbbe4e0cff29085406882380"} Mar 17 03:52:03 crc kubenswrapper[4735]: I0317 03:52:03.732635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561992-pppsp" 
event={"ID":"3263becc-fa75-443e-a544-9b021916eaf5","Type":"ContainerStarted","Data":"94fd84b18f09ba4e3d766483b532d464a2e7fafd4bf5cb978ef89abd40adf66c"} Mar 17 03:52:03 crc kubenswrapper[4735]: I0317 03:52:03.754811 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561992-pppsp" podStartSLOduration=2.740340008 podStartE2EDuration="3.753571273s" podCreationTimestamp="2026-03-17 03:52:00 +0000 UTC" firstStartedPulling="2026-03-17 03:52:01.706580457 +0000 UTC m=+9747.338813435" lastFinishedPulling="2026-03-17 03:52:02.719811702 +0000 UTC m=+9748.352044700" observedRunningTime="2026-03-17 03:52:03.752023196 +0000 UTC m=+9749.384256214" watchObservedRunningTime="2026-03-17 03:52:03.753571273 +0000 UTC m=+9749.385804271" Mar 17 03:52:05 crc kubenswrapper[4735]: I0317 03:52:05.758751 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561992-pppsp" event={"ID":"3263becc-fa75-443e-a544-9b021916eaf5","Type":"ContainerDied","Data":"94fd84b18f09ba4e3d766483b532d464a2e7fafd4bf5cb978ef89abd40adf66c"} Mar 17 03:52:05 crc kubenswrapper[4735]: I0317 03:52:05.760748 4735 generic.go:334] "Generic (PLEG): container finished" podID="3263becc-fa75-443e-a544-9b021916eaf5" containerID="94fd84b18f09ba4e3d766483b532d464a2e7fafd4bf5cb978ef89abd40adf66c" exitCode=0 Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.258932 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.338475 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltxdm\" (UniqueName: \"kubernetes.io/projected/3263becc-fa75-443e-a544-9b021916eaf5-kube-api-access-ltxdm\") pod \"3263becc-fa75-443e-a544-9b021916eaf5\" (UID: \"3263becc-fa75-443e-a544-9b021916eaf5\") " Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.370833 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3263becc-fa75-443e-a544-9b021916eaf5-kube-api-access-ltxdm" (OuterVolumeSpecName: "kube-api-access-ltxdm") pod "3263becc-fa75-443e-a544-9b021916eaf5" (UID: "3263becc-fa75-443e-a544-9b021916eaf5"). InnerVolumeSpecName "kube-api-access-ltxdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.441880 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltxdm\" (UniqueName: \"kubernetes.io/projected/3263becc-fa75-443e-a544-9b021916eaf5-kube-api-access-ltxdm\") on node \"crc\" DevicePath \"\"" Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.781657 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561992-pppsp" event={"ID":"3263becc-fa75-443e-a544-9b021916eaf5","Type":"ContainerDied","Data":"b17937c3287ac5b8801dede6996d810dd3010a1edbbe4e0cff29085406882380"} Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.781744 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561992-pppsp" Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.782682 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b17937c3287ac5b8801dede6996d810dd3010a1edbbe4e0cff29085406882380" Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.858949 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561986-jsp2s"] Mar 17 03:52:07 crc kubenswrapper[4735]: I0317 03:52:07.869149 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561986-jsp2s"] Mar 17 03:52:09 crc kubenswrapper[4735]: I0317 03:52:09.093253 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26493eba-11d9-47d0-9eaa-079f1d3451c1" path="/var/lib/kubelet/pods/26493eba-11d9-47d0-9eaa-079f1d3451c1/volumes" Mar 17 03:52:09 crc kubenswrapper[4735]: I0317 03:52:09.149150 4735 scope.go:117] "RemoveContainer" containerID="c44f11870b846c086e7b8a0331956b74057e7f1930707791074d110de671bb51" Mar 17 03:52:42 crc kubenswrapper[4735]: I0317 03:52:42.607713 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:52:42 crc kubenswrapper[4735]: I0317 03:52:42.610371 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:53:12 crc kubenswrapper[4735]: I0317 03:53:12.606340 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:53:12 crc kubenswrapper[4735]: I0317 03:53:12.607199 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.286696 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fgptd"] Mar 17 03:53:24 crc kubenswrapper[4735]: E0317 03:53:24.287811 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3263becc-fa75-443e-a544-9b021916eaf5" containerName="oc" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.287830 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3263becc-fa75-443e-a544-9b021916eaf5" containerName="oc" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.288206 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3263becc-fa75-443e-a544-9b021916eaf5" containerName="oc" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.291393 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.322817 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fgptd"] Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.410963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-catalog-content\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.411117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgf9\" (UniqueName: \"kubernetes.io/projected/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-kube-api-access-9kgf9\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.411153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-utilities\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.512809 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgf9\" (UniqueName: \"kubernetes.io/projected/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-kube-api-access-9kgf9\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.512887 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-utilities\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.512983 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-catalog-content\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.513710 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-catalog-content\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.513718 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-utilities\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.532533 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgf9\" (UniqueName: \"kubernetes.io/projected/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-kube-api-access-9kgf9\") pod \"redhat-operators-fgptd\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:24 crc kubenswrapper[4735]: I0317 03:53:24.611305 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:25 crc kubenswrapper[4735]: I0317 03:53:25.093557 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fgptd"] Mar 17 03:53:25 crc kubenswrapper[4735]: I0317 03:53:25.638582 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerID="09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3" exitCode=0 Mar 17 03:53:25 crc kubenswrapper[4735]: I0317 03:53:25.638864 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerDied","Data":"09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3"} Mar 17 03:53:25 crc kubenswrapper[4735]: I0317 03:53:25.638891 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerStarted","Data":"bcaa63e182359aadd34ecfcfa56ce9d9bab2bdf6659ffe2c5e4c2311bb2623c5"} Mar 17 03:53:26 crc kubenswrapper[4735]: I0317 03:53:26.650073 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerStarted","Data":"44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c"} Mar 17 03:53:31 crc kubenswrapper[4735]: I0317 03:53:31.699963 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerID="44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c" exitCode=0 Mar 17 03:53:31 crc kubenswrapper[4735]: I0317 03:53:31.700032 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" 
event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerDied","Data":"44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c"} Mar 17 03:53:33 crc kubenswrapper[4735]: I0317 03:53:33.737254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerStarted","Data":"f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7"} Mar 17 03:53:34 crc kubenswrapper[4735]: I0317 03:53:34.612848 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:34 crc kubenswrapper[4735]: I0317 03:53:34.613291 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:53:35 crc kubenswrapper[4735]: I0317 03:53:35.662906 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fgptd" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" probeResult="failure" output=< Mar 17 03:53:35 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:53:35 crc kubenswrapper[4735]: > Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.606550 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.607044 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.607089 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.608515 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fec9e2e24648c9533863dfbc1a681fa712a911fd320d183d7d4a5ff812b9d553"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.608969 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://fec9e2e24648c9533863dfbc1a681fa712a911fd320d183d7d4a5ff812b9d553" gracePeriod=600 Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.823921 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="fec9e2e24648c9533863dfbc1a681fa712a911fd320d183d7d4a5ff812b9d553" exitCode=0 Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.823970 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"fec9e2e24648c9533863dfbc1a681fa712a911fd320d183d7d4a5ff812b9d553"} Mar 17 03:53:42 crc kubenswrapper[4735]: I0317 03:53:42.824057 4735 scope.go:117] "RemoveContainer" containerID="0b99c490b7478cc714abfda56ccc18465262954585c3ef6e3ae36c41fd4a6221" Mar 17 03:53:43 crc kubenswrapper[4735]: I0317 03:53:43.834715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487"} Mar 17 03:53:43 crc kubenswrapper[4735]: I0317 03:53:43.865796 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fgptd" podStartSLOduration=13.30803248 podStartE2EDuration="19.865777617s" podCreationTimestamp="2026-03-17 03:53:24 +0000 UTC" firstStartedPulling="2026-03-17 03:53:25.639838149 +0000 UTC m=+9831.272071137" lastFinishedPulling="2026-03-17 03:53:32.197583296 +0000 UTC m=+9837.829816274" observedRunningTime="2026-03-17 03:53:33.7679036 +0000 UTC m=+9839.400136588" watchObservedRunningTime="2026-03-17 03:53:43.865777617 +0000 UTC m=+9849.498010595" Mar 17 03:53:46 crc kubenswrapper[4735]: I0317 03:53:46.173320 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fgptd" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" probeResult="failure" output=< Mar 17 03:53:46 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:53:46 crc kubenswrapper[4735]: > Mar 17 03:53:55 crc kubenswrapper[4735]: I0317 03:53:55.687564 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fgptd" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" probeResult="failure" output=< Mar 17 03:53:55 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:53:55 crc kubenswrapper[4735]: > Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.184324 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561994-pbvfk"] Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.192797 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.194966 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561994-pbvfk"] Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.203934 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.203973 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.203948 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.366608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjzq\" (UniqueName: \"kubernetes.io/projected/833bc2e6-f71c-4b83-bf69-d37773c5b08a-kube-api-access-lqjzq\") pod \"auto-csr-approver-29561994-pbvfk\" (UID: \"833bc2e6-f71c-4b83-bf69-d37773c5b08a\") " pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.468731 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjzq\" (UniqueName: \"kubernetes.io/projected/833bc2e6-f71c-4b83-bf69-d37773c5b08a-kube-api-access-lqjzq\") pod \"auto-csr-approver-29561994-pbvfk\" (UID: \"833bc2e6-f71c-4b83-bf69-d37773c5b08a\") " pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.503798 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjzq\" (UniqueName: \"kubernetes.io/projected/833bc2e6-f71c-4b83-bf69-d37773c5b08a-kube-api-access-lqjzq\") pod \"auto-csr-approver-29561994-pbvfk\" (UID: \"833bc2e6-f71c-4b83-bf69-d37773c5b08a\") " 
pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:00 crc kubenswrapper[4735]: I0317 03:54:00.515100 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:01 crc kubenswrapper[4735]: I0317 03:54:01.560934 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561994-pbvfk"] Mar 17 03:54:02 crc kubenswrapper[4735]: I0317 03:54:02.023831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" event={"ID":"833bc2e6-f71c-4b83-bf69-d37773c5b08a","Type":"ContainerStarted","Data":"75f13baefbc65bc0be2f0dfd1837bc23f3e78db6b533da4c85edfe2cd1c22de9"} Mar 17 03:54:04 crc kubenswrapper[4735]: I0317 03:54:04.046549 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" event={"ID":"833bc2e6-f71c-4b83-bf69-d37773c5b08a","Type":"ContainerStarted","Data":"be14bef69cd86649d1967a901ecc94b28d9c9208aa6523ee5d2210fad037b9f9"} Mar 17 03:54:04 crc kubenswrapper[4735]: I0317 03:54:04.070939 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" podStartSLOduration=3.095256097 podStartE2EDuration="4.070908368s" podCreationTimestamp="2026-03-17 03:54:00 +0000 UTC" firstStartedPulling="2026-03-17 03:54:01.577666263 +0000 UTC m=+9867.209899251" lastFinishedPulling="2026-03-17 03:54:02.553318544 +0000 UTC m=+9868.185551522" observedRunningTime="2026-03-17 03:54:04.062130757 +0000 UTC m=+9869.694363755" watchObservedRunningTime="2026-03-17 03:54:04.070908368 +0000 UTC m=+9869.703141376" Mar 17 03:54:05 crc kubenswrapper[4735]: I0317 03:54:05.057896 4735 generic.go:334] "Generic (PLEG): container finished" podID="833bc2e6-f71c-4b83-bf69-d37773c5b08a" containerID="be14bef69cd86649d1967a901ecc94b28d9c9208aa6523ee5d2210fad037b9f9" exitCode=0 Mar 17 03:54:05 crc 
kubenswrapper[4735]: I0317 03:54:05.058232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" event={"ID":"833bc2e6-f71c-4b83-bf69-d37773c5b08a","Type":"ContainerDied","Data":"be14bef69cd86649d1967a901ecc94b28d9c9208aa6523ee5d2210fad037b9f9"} Mar 17 03:54:05 crc kubenswrapper[4735]: I0317 03:54:05.688259 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fgptd" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" probeResult="failure" output=< Mar 17 03:54:05 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:54:05 crc kubenswrapper[4735]: > Mar 17 03:54:06 crc kubenswrapper[4735]: I0317 03:54:06.580819 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:06 crc kubenswrapper[4735]: I0317 03:54:06.689606 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqjzq\" (UniqueName: \"kubernetes.io/projected/833bc2e6-f71c-4b83-bf69-d37773c5b08a-kube-api-access-lqjzq\") pod \"833bc2e6-f71c-4b83-bf69-d37773c5b08a\" (UID: \"833bc2e6-f71c-4b83-bf69-d37773c5b08a\") " Mar 17 03:54:06 crc kubenswrapper[4735]: I0317 03:54:06.712348 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833bc2e6-f71c-4b83-bf69-d37773c5b08a-kube-api-access-lqjzq" (OuterVolumeSpecName: "kube-api-access-lqjzq") pod "833bc2e6-f71c-4b83-bf69-d37773c5b08a" (UID: "833bc2e6-f71c-4b83-bf69-d37773c5b08a"). InnerVolumeSpecName "kube-api-access-lqjzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:54:06 crc kubenswrapper[4735]: I0317 03:54:06.792004 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqjzq\" (UniqueName: \"kubernetes.io/projected/833bc2e6-f71c-4b83-bf69-d37773c5b08a-kube-api-access-lqjzq\") on node \"crc\" DevicePath \"\"" Mar 17 03:54:07 crc kubenswrapper[4735]: I0317 03:54:07.080722 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" Mar 17 03:54:07 crc kubenswrapper[4735]: I0317 03:54:07.085188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561994-pbvfk" event={"ID":"833bc2e6-f71c-4b83-bf69-d37773c5b08a","Type":"ContainerDied","Data":"75f13baefbc65bc0be2f0dfd1837bc23f3e78db6b533da4c85edfe2cd1c22de9"} Mar 17 03:54:07 crc kubenswrapper[4735]: I0317 03:54:07.085230 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f13baefbc65bc0be2f0dfd1837bc23f3e78db6b533da4c85edfe2cd1c22de9" Mar 17 03:54:07 crc kubenswrapper[4735]: I0317 03:54:07.169352 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561988-dxklv"] Mar 17 03:54:07 crc kubenswrapper[4735]: I0317 03:54:07.179134 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561988-dxklv"] Mar 17 03:54:09 crc kubenswrapper[4735]: I0317 03:54:09.096392 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7c2d5f-ea6e-4d98-bacc-000724b141b7" path="/var/lib/kubelet/pods/0c7c2d5f-ea6e-4d98-bacc-000724b141b7/volumes" Mar 17 03:54:09 crc kubenswrapper[4735]: I0317 03:54:09.277525 4735 scope.go:117] "RemoveContainer" containerID="a329352668f63fd09723f7cfd5e72a65057333597253dde8949797362c8e8ff9" Mar 17 03:54:14 crc kubenswrapper[4735]: I0317 03:54:14.705305 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:54:14 crc kubenswrapper[4735]: I0317 03:54:14.825662 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:54:14 crc kubenswrapper[4735]: I0317 03:54:14.962264 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fgptd"] Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.267559 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fgptd" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" containerID="cri-o://f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7" gracePeriod=2 Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.812889 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.961906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-catalog-content\") pod \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.962038 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-utilities\") pod \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\" (UID: \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.962156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kgf9\" (UniqueName: \"kubernetes.io/projected/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-kube-api-access-9kgf9\") pod \"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\" (UID: 
\"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba\") " Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.962767 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-utilities" (OuterVolumeSpecName: "utilities") pod "d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" (UID: "d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.964254 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:54:16 crc kubenswrapper[4735]: I0317 03:54:16.974194 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-kube-api-access-9kgf9" (OuterVolumeSpecName: "kube-api-access-9kgf9") pod "d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" (UID: "d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba"). InnerVolumeSpecName "kube-api-access-9kgf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.067315 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kgf9\" (UniqueName: \"kubernetes.io/projected/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-kube-api-access-9kgf9\") on node \"crc\" DevicePath \"\"" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.105081 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" (UID: "d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.169383 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.282087 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerID="f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7" exitCode=0 Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.282135 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgptd" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.282167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerDied","Data":"f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7"} Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.282223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgptd" event={"ID":"d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba","Type":"ContainerDied","Data":"bcaa63e182359aadd34ecfcfa56ce9d9bab2bdf6659ffe2c5e4c2311bb2623c5"} Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.282249 4735 scope.go:117] "RemoveContainer" containerID="f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.321780 4735 scope.go:117] "RemoveContainer" containerID="44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.347718 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fgptd"] Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 
03:54:17.361472 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fgptd"] Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.367318 4735 scope.go:117] "RemoveContainer" containerID="09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.427530 4735 scope.go:117] "RemoveContainer" containerID="f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7" Mar 17 03:54:17 crc kubenswrapper[4735]: E0317 03:54:17.435907 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7\": container with ID starting with f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7 not found: ID does not exist" containerID="f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.435969 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7"} err="failed to get container status \"f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7\": rpc error: code = NotFound desc = could not find container \"f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7\": container with ID starting with f05f47911a038ac32a2cf363b2b73deb7011c87ee828f3a667b3c46ba299fcf7 not found: ID does not exist" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.436003 4735 scope.go:117] "RemoveContainer" containerID="44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c" Mar 17 03:54:17 crc kubenswrapper[4735]: E0317 03:54:17.436903 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c\": container with ID 
starting with 44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c not found: ID does not exist" containerID="44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.436997 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c"} err="failed to get container status \"44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c\": rpc error: code = NotFound desc = could not find container \"44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c\": container with ID starting with 44c8d31421b4d045800f3b4a2879d4ea59d6a0a986b776ea9bff8e4431f6420c not found: ID does not exist" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.437049 4735 scope.go:117] "RemoveContainer" containerID="09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3" Mar 17 03:54:17 crc kubenswrapper[4735]: E0317 03:54:17.437520 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3\": container with ID starting with 09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3 not found: ID does not exist" containerID="09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3" Mar 17 03:54:17 crc kubenswrapper[4735]: I0317 03:54:17.437555 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3"} err="failed to get container status \"09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3\": rpc error: code = NotFound desc = could not find container \"09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3\": container with ID starting with 09029360ba56e0e02f42db3475f8ca6b8eaccb29a1e1fa147bfca7edcccc51c3 not found: 
ID does not exist" Mar 17 03:54:19 crc kubenswrapper[4735]: I0317 03:54:19.090171 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" path="/var/lib/kubelet/pods/d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba/volumes" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.476334 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mtr4g"] Mar 17 03:55:32 crc kubenswrapper[4735]: E0317 03:55:32.478219 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="extract-utilities" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.478242 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="extract-utilities" Mar 17 03:55:32 crc kubenswrapper[4735]: E0317 03:55:32.478274 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.478282 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" Mar 17 03:55:32 crc kubenswrapper[4735]: E0317 03:55:32.478322 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833bc2e6-f71c-4b83-bf69-d37773c5b08a" containerName="oc" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.478335 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="833bc2e6-f71c-4b83-bf69-d37773c5b08a" containerName="oc" Mar 17 03:55:32 crc kubenswrapper[4735]: E0317 03:55:32.478347 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="extract-content" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.478354 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" 
containerName="extract-content" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.484911 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="833bc2e6-f71c-4b83-bf69-d37773c5b08a" containerName="oc" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.485060 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8331ef3-1fa6-4525-b6c3-676ce1b4a3ba" containerName="registry-server" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.494995 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.505297 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtr4g"] Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.656878 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-catalog-content\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.656996 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-utilities\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.657128 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qdq7\" (UniqueName: \"kubernetes.io/projected/b13ab1c2-63f7-4021-951a-32557fc2292a-kube-api-access-7qdq7\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " 
pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.759410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-utilities\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.759522 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qdq7\" (UniqueName: \"kubernetes.io/projected/b13ab1c2-63f7-4021-951a-32557fc2292a-kube-api-access-7qdq7\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.759625 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-catalog-content\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.759891 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-utilities\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.759982 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-catalog-content\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " 
pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.781137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qdq7\" (UniqueName: \"kubernetes.io/projected/b13ab1c2-63f7-4021-951a-32557fc2292a-kube-api-access-7qdq7\") pod \"community-operators-mtr4g\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:32 crc kubenswrapper[4735]: I0317 03:55:32.832924 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:33 crc kubenswrapper[4735]: I0317 03:55:33.373404 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtr4g"] Mar 17 03:55:33 crc kubenswrapper[4735]: I0317 03:55:33.541936 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerStarted","Data":"80c6c20ad471ad4c47726c217371650cfcbfa3e6ef16ff383547638093ef926e"} Mar 17 03:55:34 crc kubenswrapper[4735]: I0317 03:55:34.554774 4735 generic.go:334] "Generic (PLEG): container finished" podID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerID="c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca" exitCode=0 Mar 17 03:55:34 crc kubenswrapper[4735]: I0317 03:55:34.554833 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerDied","Data":"c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca"} Mar 17 03:55:34 crc kubenswrapper[4735]: I0317 03:55:34.558780 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 03:55:35 crc kubenswrapper[4735]: I0317 03:55:35.566442 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerStarted","Data":"71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a"} Mar 17 03:55:37 crc kubenswrapper[4735]: I0317 03:55:37.591736 4735 generic.go:334] "Generic (PLEG): container finished" podID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerID="71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a" exitCode=0 Mar 17 03:55:37 crc kubenswrapper[4735]: I0317 03:55:37.591837 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerDied","Data":"71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a"} Mar 17 03:55:38 crc kubenswrapper[4735]: I0317 03:55:38.605288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerStarted","Data":"c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd"} Mar 17 03:55:42 crc kubenswrapper[4735]: I0317 03:55:42.606639 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:55:42 crc kubenswrapper[4735]: I0317 03:55:42.607121 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:55:42 crc kubenswrapper[4735]: I0317 03:55:42.834237 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:42 crc kubenswrapper[4735]: I0317 03:55:42.834582 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:44 crc kubenswrapper[4735]: I0317 03:55:44.093967 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mtr4g" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="registry-server" probeResult="failure" output=< Mar 17 03:55:44 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:55:44 crc kubenswrapper[4735]: > Mar 17 03:55:52 crc kubenswrapper[4735]: I0317 03:55:52.915760 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:52 crc kubenswrapper[4735]: I0317 03:55:52.956137 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mtr4g" podStartSLOduration=17.366589108 podStartE2EDuration="20.956108429s" podCreationTimestamp="2026-03-17 03:55:32 +0000 UTC" firstStartedPulling="2026-03-17 03:55:34.556959898 +0000 UTC m=+9960.189192876" lastFinishedPulling="2026-03-17 03:55:38.146479209 +0000 UTC m=+9963.778712197" observedRunningTime="2026-03-17 03:55:38.625284562 +0000 UTC m=+9964.257517540" watchObservedRunningTime="2026-03-17 03:55:52.956108429 +0000 UTC m=+9978.588341417" Mar 17 03:55:52 crc kubenswrapper[4735]: I0317 03:55:52.987689 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:53 crc kubenswrapper[4735]: I0317 03:55:53.178775 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtr4g"] Mar 17 03:55:54 crc kubenswrapper[4735]: I0317 03:55:54.800197 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-mtr4g" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="registry-server" containerID="cri-o://c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd" gracePeriod=2 Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.307383 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.402449 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-catalog-content\") pod \"b13ab1c2-63f7-4021-951a-32557fc2292a\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.402611 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qdq7\" (UniqueName: \"kubernetes.io/projected/b13ab1c2-63f7-4021-951a-32557fc2292a-kube-api-access-7qdq7\") pod \"b13ab1c2-63f7-4021-951a-32557fc2292a\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.402685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-utilities\") pod \"b13ab1c2-63f7-4021-951a-32557fc2292a\" (UID: \"b13ab1c2-63f7-4021-951a-32557fc2292a\") " Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.403624 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-utilities" (OuterVolumeSpecName: "utilities") pod "b13ab1c2-63f7-4021-951a-32557fc2292a" (UID: "b13ab1c2-63f7-4021-951a-32557fc2292a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.404324 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.413150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13ab1c2-63f7-4021-951a-32557fc2292a-kube-api-access-7qdq7" (OuterVolumeSpecName: "kube-api-access-7qdq7") pod "b13ab1c2-63f7-4021-951a-32557fc2292a" (UID: "b13ab1c2-63f7-4021-951a-32557fc2292a"). InnerVolumeSpecName "kube-api-access-7qdq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.455994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b13ab1c2-63f7-4021-951a-32557fc2292a" (UID: "b13ab1c2-63f7-4021-951a-32557fc2292a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.505657 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qdq7\" (UniqueName: \"kubernetes.io/projected/b13ab1c2-63f7-4021-951a-32557fc2292a-kube-api-access-7qdq7\") on node \"crc\" DevicePath \"\"" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.505686 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13ab1c2-63f7-4021-951a-32557fc2292a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.830069 4735 generic.go:334] "Generic (PLEG): container finished" podID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerID="c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd" exitCode=0 Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.830145 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtr4g" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.830136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerDied","Data":"c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd"} Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.830775 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtr4g" event={"ID":"b13ab1c2-63f7-4021-951a-32557fc2292a","Type":"ContainerDied","Data":"80c6c20ad471ad4c47726c217371650cfcbfa3e6ef16ff383547638093ef926e"} Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.830921 4735 scope.go:117] "RemoveContainer" containerID="c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.877217 4735 scope.go:117] "RemoveContainer" 
containerID="71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.879986 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtr4g"] Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.894092 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mtr4g"] Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.899115 4735 scope.go:117] "RemoveContainer" containerID="c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.962274 4735 scope.go:117] "RemoveContainer" containerID="c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd" Mar 17 03:55:55 crc kubenswrapper[4735]: E0317 03:55:55.963249 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd\": container with ID starting with c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd not found: ID does not exist" containerID="c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.963292 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd"} err="failed to get container status \"c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd\": rpc error: code = NotFound desc = could not find container \"c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd\": container with ID starting with c766d3b36acfc946b325caf06767e27579067a15c95b30e56a51710c6e7f1edd not found: ID does not exist" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.963317 4735 scope.go:117] "RemoveContainer" 
containerID="71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a" Mar 17 03:55:55 crc kubenswrapper[4735]: E0317 03:55:55.963776 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a\": container with ID starting with 71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a not found: ID does not exist" containerID="71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.963815 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a"} err="failed to get container status \"71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a\": rpc error: code = NotFound desc = could not find container \"71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a\": container with ID starting with 71e6c9c768fcb5ed6ecd541e241d6f625df315291649a5b2b1a5116af984b58a not found: ID does not exist" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.963840 4735 scope.go:117] "RemoveContainer" containerID="c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca" Mar 17 03:55:55 crc kubenswrapper[4735]: E0317 03:55:55.964193 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca\": container with ID starting with c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca not found: ID does not exist" containerID="c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca" Mar 17 03:55:55 crc kubenswrapper[4735]: I0317 03:55:55.964216 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca"} err="failed to get container status \"c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca\": rpc error: code = NotFound desc = could not find container \"c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca\": container with ID starting with c19246377976a3e839819edccaab6e19aeb82a28fcc71609668ad80687cc6eca not found: ID does not exist" Mar 17 03:55:57 crc kubenswrapper[4735]: I0317 03:55:57.092136 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" path="/var/lib/kubelet/pods/b13ab1c2-63f7-4021-951a-32557fc2292a/volumes" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.154726 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561996-snxch"] Mar 17 03:56:00 crc kubenswrapper[4735]: E0317 03:56:00.155895 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="extract-utilities" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.155916 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="extract-utilities" Mar 17 03:56:00 crc kubenswrapper[4735]: E0317 03:56:00.155951 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="registry-server" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.155963 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="registry-server" Mar 17 03:56:00 crc kubenswrapper[4735]: E0317 03:56:00.156000 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="extract-content" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.156013 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="extract-content" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.156333 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13ab1c2-63f7-4021-951a-32557fc2292a" containerName="registry-server" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.157312 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.158916 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.159302 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.159468 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.176960 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561996-snxch"] Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.315674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht8dp\" (UniqueName: \"kubernetes.io/projected/b930ea0a-8da0-437c-9934-a2b9f694d031-kube-api-access-ht8dp\") pod \"auto-csr-approver-29561996-snxch\" (UID: \"b930ea0a-8da0-437c-9934-a2b9f694d031\") " pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.418096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht8dp\" (UniqueName: \"kubernetes.io/projected/b930ea0a-8da0-437c-9934-a2b9f694d031-kube-api-access-ht8dp\") pod \"auto-csr-approver-29561996-snxch\" (UID: \"b930ea0a-8da0-437c-9934-a2b9f694d031\") " 
pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.441773 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht8dp\" (UniqueName: \"kubernetes.io/projected/b930ea0a-8da0-437c-9934-a2b9f694d031-kube-api-access-ht8dp\") pod \"auto-csr-approver-29561996-snxch\" (UID: \"b930ea0a-8da0-437c-9934-a2b9f694d031\") " pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.483762 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:00 crc kubenswrapper[4735]: I0317 03:56:00.983129 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561996-snxch"] Mar 17 03:56:01 crc kubenswrapper[4735]: I0317 03:56:01.895445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561996-snxch" event={"ID":"b930ea0a-8da0-437c-9934-a2b9f694d031","Type":"ContainerStarted","Data":"e2050e69997bfce7bac9c817c7a87c1e078ba975dc7c9028dc0c216f9fef587d"} Mar 17 03:56:02 crc kubenswrapper[4735]: I0317 03:56:02.909447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561996-snxch" event={"ID":"b930ea0a-8da0-437c-9934-a2b9f694d031","Type":"ContainerStarted","Data":"521fa6a8a1ece3cf8d78f52a1ac0e3af7dd36fe51356bc5025a56e6730f5bcd6"} Mar 17 03:56:02 crc kubenswrapper[4735]: I0317 03:56:02.934204 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561996-snxch" podStartSLOduration=1.828025357 podStartE2EDuration="2.934176471s" podCreationTimestamp="2026-03-17 03:56:00 +0000 UTC" firstStartedPulling="2026-03-17 03:56:00.990077633 +0000 UTC m=+9986.622310611" lastFinishedPulling="2026-03-17 03:56:02.096228707 +0000 UTC m=+9987.728461725" observedRunningTime="2026-03-17 
03:56:02.925640266 +0000 UTC m=+9988.557873264" watchObservedRunningTime="2026-03-17 03:56:02.934176471 +0000 UTC m=+9988.566409499" Mar 17 03:56:03 crc kubenswrapper[4735]: I0317 03:56:03.918464 4735 generic.go:334] "Generic (PLEG): container finished" podID="b930ea0a-8da0-437c-9934-a2b9f694d031" containerID="521fa6a8a1ece3cf8d78f52a1ac0e3af7dd36fe51356bc5025a56e6730f5bcd6" exitCode=0 Mar 17 03:56:03 crc kubenswrapper[4735]: I0317 03:56:03.918744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561996-snxch" event={"ID":"b930ea0a-8da0-437c-9934-a2b9f694d031","Type":"ContainerDied","Data":"521fa6a8a1ece3cf8d78f52a1ac0e3af7dd36fe51356bc5025a56e6730f5bcd6"} Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.288239 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.414824 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht8dp\" (UniqueName: \"kubernetes.io/projected/b930ea0a-8da0-437c-9934-a2b9f694d031-kube-api-access-ht8dp\") pod \"b930ea0a-8da0-437c-9934-a2b9f694d031\" (UID: \"b930ea0a-8da0-437c-9934-a2b9f694d031\") " Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.421320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b930ea0a-8da0-437c-9934-a2b9f694d031-kube-api-access-ht8dp" (OuterVolumeSpecName: "kube-api-access-ht8dp") pod "b930ea0a-8da0-437c-9934-a2b9f694d031" (UID: "b930ea0a-8da0-437c-9934-a2b9f694d031"). InnerVolumeSpecName "kube-api-access-ht8dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.517171 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht8dp\" (UniqueName: \"kubernetes.io/projected/b930ea0a-8da0-437c-9934-a2b9f694d031-kube-api-access-ht8dp\") on node \"crc\" DevicePath \"\"" Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.937122 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561996-snxch" event={"ID":"b930ea0a-8da0-437c-9934-a2b9f694d031","Type":"ContainerDied","Data":"e2050e69997bfce7bac9c817c7a87c1e078ba975dc7c9028dc0c216f9fef587d"} Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.937160 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561996-snxch" Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.937171 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2050e69997bfce7bac9c817c7a87c1e078ba975dc7c9028dc0c216f9fef587d" Mar 17 03:56:05 crc kubenswrapper[4735]: I0317 03:56:05.999832 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561990-dq5bb"] Mar 17 03:56:06 crc kubenswrapper[4735]: I0317 03:56:06.038361 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561990-dq5bb"] Mar 17 03:56:07 crc kubenswrapper[4735]: I0317 03:56:07.088153 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f03e13-4ca9-432d-b6dc-cb707d75556b" path="/var/lib/kubelet/pods/b5f03e13-4ca9-432d-b6dc-cb707d75556b/volumes" Mar 17 03:56:09 crc kubenswrapper[4735]: I0317 03:56:09.466785 4735 scope.go:117] "RemoveContainer" containerID="a03488f2cc78d31ac531e9c8807c161559420f999a3b72d770879718c956544d" Mar 17 03:56:12 crc kubenswrapper[4735]: I0317 03:56:12.606330 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:56:12 crc kubenswrapper[4735]: I0317 03:56:12.606791 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:56:42 crc kubenswrapper[4735]: I0317 03:56:42.606713 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 03:56:42 crc kubenswrapper[4735]: I0317 03:56:42.607206 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 03:56:42 crc kubenswrapper[4735]: I0317 03:56:42.607248 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 03:56:42 crc kubenswrapper[4735]: I0317 03:56:42.608437 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 17 03:56:42 crc kubenswrapper[4735]: I0317 03:56:42.608500 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" gracePeriod=600 Mar 17 03:56:42 crc kubenswrapper[4735]: E0317 03:56:42.758751 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:56:43 crc kubenswrapper[4735]: I0317 03:56:43.312496 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" exitCode=0 Mar 17 03:56:43 crc kubenswrapper[4735]: I0317 03:56:43.312548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487"} Mar 17 03:56:43 crc kubenswrapper[4735]: I0317 03:56:43.312588 4735 scope.go:117] "RemoveContainer" containerID="fec9e2e24648c9533863dfbc1a681fa712a911fd320d183d7d4a5ff812b9d553" Mar 17 03:56:43 crc kubenswrapper[4735]: I0317 03:56:43.313134 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:56:43 crc kubenswrapper[4735]: E0317 03:56:43.313401 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:56:54 crc kubenswrapper[4735]: I0317 03:56:54.072879 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:56:54 crc kubenswrapper[4735]: E0317 03:56:54.073487 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:57:06 crc kubenswrapper[4735]: I0317 03:57:06.074016 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:57:06 crc kubenswrapper[4735]: E0317 03:57:06.074799 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:57:21 crc kubenswrapper[4735]: I0317 03:57:21.072849 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:57:21 crc kubenswrapper[4735]: E0317 03:57:21.073576 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:57:32 crc kubenswrapper[4735]: I0317 03:57:32.073016 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:57:32 crc kubenswrapper[4735]: E0317 03:57:32.074677 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.713189 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xjcn8"] Mar 17 03:57:40 crc kubenswrapper[4735]: E0317 03:57:40.716507 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b930ea0a-8da0-437c-9934-a2b9f694d031" containerName="oc" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.716570 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b930ea0a-8da0-437c-9934-a2b9f694d031" containerName="oc" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.718132 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b930ea0a-8da0-437c-9934-a2b9f694d031" containerName="oc" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.721819 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.747840 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjcn8"] Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.881407 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-catalog-content\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.881476 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-utilities\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.881540 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlc9\" (UniqueName: \"kubernetes.io/projected/df2e306b-a192-4a55-a172-cdab1ce9747d-kube-api-access-rqlc9\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.983789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-catalog-content\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.983846 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-utilities\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.983896 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlc9\" (UniqueName: \"kubernetes.io/projected/df2e306b-a192-4a55-a172-cdab1ce9747d-kube-api-access-rqlc9\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.985343 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-catalog-content\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:40 crc kubenswrapper[4735]: I0317 03:57:40.986060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-utilities\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:41 crc kubenswrapper[4735]: I0317 03:57:41.019602 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlc9\" (UniqueName: \"kubernetes.io/projected/df2e306b-a192-4a55-a172-cdab1ce9747d-kube-api-access-rqlc9\") pod \"redhat-marketplace-xjcn8\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:41 crc kubenswrapper[4735]: I0317 03:57:41.053481 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.312586 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjcn8"] Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.468966 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n5mwn"] Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.471876 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.500275 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5mwn"] Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.618903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-catalog-content\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.619217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-utilities\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.619261 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgz4\" (UniqueName: \"kubernetes.io/projected/2f67eba2-a294-4428-973e-f2ae16e9d88f-kube-api-access-szgz4\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " 
pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.721829 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-utilities\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.722234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szgz4\" (UniqueName: \"kubernetes.io/projected/2f67eba2-a294-4428-973e-f2ae16e9d88f-kube-api-access-szgz4\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.722410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-catalog-content\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.722410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-utilities\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.723184 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-catalog-content\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " 
pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.744514 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgz4\" (UniqueName: \"kubernetes.io/projected/2f67eba2-a294-4428-973e-f2ae16e9d88f-kube-api-access-szgz4\") pod \"certified-operators-n5mwn\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.795461 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.976416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerDied","Data":"45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65"} Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.986964 4735 generic.go:334] "Generic (PLEG): container finished" podID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerID="45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65" exitCode=0 Mar 17 03:57:42 crc kubenswrapper[4735]: I0317 03:57:42.987046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerStarted","Data":"c88d8513eea2727595daceaace8bf7378265c5dc945832ca74732127da99d464"} Mar 17 03:57:43 crc kubenswrapper[4735]: I0317 03:57:43.076387 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:57:43 crc kubenswrapper[4735]: E0317 03:57:43.077220 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:57:43 crc kubenswrapper[4735]: I0317 03:57:43.351678 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5mwn"] Mar 17 03:57:43 crc kubenswrapper[4735]: W0317 03:57:43.360306 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f67eba2_a294_4428_973e_f2ae16e9d88f.slice/crio-d8c3ad578a30a4e6654d0fcb5a78b9649a0499e5e96ecb1a9c42610f3c172e10 WatchSource:0}: Error finding container d8c3ad578a30a4e6654d0fcb5a78b9649a0499e5e96ecb1a9c42610f3c172e10: Status 404 returned error can't find the container with id d8c3ad578a30a4e6654d0fcb5a78b9649a0499e5e96ecb1a9c42610f3c172e10 Mar 17 03:57:43 crc kubenswrapper[4735]: I0317 03:57:43.998543 4735 generic.go:334] "Generic (PLEG): container finished" podID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerID="63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937" exitCode=0 Mar 17 03:57:43 crc kubenswrapper[4735]: I0317 03:57:43.998617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerDied","Data":"63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937"} Mar 17 03:57:43 crc kubenswrapper[4735]: I0317 03:57:43.998669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerStarted","Data":"d8c3ad578a30a4e6654d0fcb5a78b9649a0499e5e96ecb1a9c42610f3c172e10"} Mar 17 03:57:45 crc kubenswrapper[4735]: I0317 03:57:45.013989 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerStarted","Data":"907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463"} Mar 17 03:57:45 crc kubenswrapper[4735]: I0317 03:57:45.022209 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerStarted","Data":"e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003"} Mar 17 03:57:46 crc kubenswrapper[4735]: I0317 03:57:46.035744 4735 generic.go:334] "Generic (PLEG): container finished" podID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerID="907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463" exitCode=0 Mar 17 03:57:46 crc kubenswrapper[4735]: I0317 03:57:46.036053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerDied","Data":"907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463"} Mar 17 03:57:48 crc kubenswrapper[4735]: I0317 03:57:48.059721 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerStarted","Data":"3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7"} Mar 17 03:57:48 crc kubenswrapper[4735]: I0317 03:57:48.063562 4735 generic.go:334] "Generic (PLEG): container finished" podID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerID="e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003" exitCode=0 Mar 17 03:57:48 crc kubenswrapper[4735]: I0317 03:57:48.063604 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" 
event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerDied","Data":"e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003"} Mar 17 03:57:48 crc kubenswrapper[4735]: I0317 03:57:48.087744 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xjcn8" podStartSLOduration=4.576402579 podStartE2EDuration="8.08665075s" podCreationTimestamp="2026-03-17 03:57:40 +0000 UTC" firstStartedPulling="2026-03-17 03:57:43.013188356 +0000 UTC m=+10088.645421334" lastFinishedPulling="2026-03-17 03:57:46.523436527 +0000 UTC m=+10092.155669505" observedRunningTime="2026-03-17 03:57:48.084270142 +0000 UTC m=+10093.716503130" watchObservedRunningTime="2026-03-17 03:57:48.08665075 +0000 UTC m=+10093.718883738" Mar 17 03:57:49 crc kubenswrapper[4735]: I0317 03:57:49.092399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerStarted","Data":"e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db"} Mar 17 03:57:49 crc kubenswrapper[4735]: I0317 03:57:49.113068 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n5mwn" podStartSLOduration=2.655614196 podStartE2EDuration="7.113049902s" podCreationTimestamp="2026-03-17 03:57:42 +0000 UTC" firstStartedPulling="2026-03-17 03:57:44.003795706 +0000 UTC m=+10089.636028734" lastFinishedPulling="2026-03-17 03:57:48.461231462 +0000 UTC m=+10094.093464440" observedRunningTime="2026-03-17 03:57:49.111448243 +0000 UTC m=+10094.743681221" watchObservedRunningTime="2026-03-17 03:57:49.113049902 +0000 UTC m=+10094.745282880" Mar 17 03:57:51 crc kubenswrapper[4735]: I0317 03:57:51.054617 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:51 crc kubenswrapper[4735]: I0317 03:57:51.054951 
4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:57:52 crc kubenswrapper[4735]: I0317 03:57:52.099795 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xjcn8" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="registry-server" probeResult="failure" output=< Mar 17 03:57:52 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:57:52 crc kubenswrapper[4735]: > Mar 17 03:57:52 crc kubenswrapper[4735]: I0317 03:57:52.796240 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:52 crc kubenswrapper[4735]: I0317 03:57:52.796600 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:57:53 crc kubenswrapper[4735]: I0317 03:57:53.867213 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-n5mwn" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="registry-server" probeResult="failure" output=< Mar 17 03:57:53 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 03:57:53 crc kubenswrapper[4735]: > Mar 17 03:57:55 crc kubenswrapper[4735]: I0317 03:57:55.079107 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:57:55 crc kubenswrapper[4735]: E0317 03:57:55.079660 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.303679 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561998-jkmdf"] Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.305547 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.320378 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.320421 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561998-jkmdf"] Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.320494 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.376870 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.502720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9lm\" (UniqueName: \"kubernetes.io/projected/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6-kube-api-access-bh9lm\") pod \"auto-csr-approver-29561998-jkmdf\" (UID: \"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6\") " pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.606023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9lm\" (UniqueName: \"kubernetes.io/projected/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6-kube-api-access-bh9lm\") pod \"auto-csr-approver-29561998-jkmdf\" (UID: \"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6\") " pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:00 crc 
kubenswrapper[4735]: I0317 03:58:00.636368 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9lm\" (UniqueName: \"kubernetes.io/projected/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6-kube-api-access-bh9lm\") pod \"auto-csr-approver-29561998-jkmdf\" (UID: \"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6\") " pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:00 crc kubenswrapper[4735]: I0317 03:58:00.660607 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:01 crc kubenswrapper[4735]: I0317 03:58:01.110950 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:58:01 crc kubenswrapper[4735]: I0317 03:58:01.157030 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561998-jkmdf"] Mar 17 03:58:01 crc kubenswrapper[4735]: I0317 03:58:01.184339 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:58:01 crc kubenswrapper[4735]: I0317 03:58:01.194079 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" event={"ID":"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6","Type":"ContainerStarted","Data":"146dbea1fc1e766bf31b39a3c521ffe2dea651c3686cb0aa20c591ac870c80c2"} Mar 17 03:58:01 crc kubenswrapper[4735]: I0317 03:58:01.347202 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjcn8"] Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.206096 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xjcn8" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="registry-server" containerID="cri-o://3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7" 
gracePeriod=2 Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.777198 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.861104 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-utilities\") pod \"df2e306b-a192-4a55-a172-cdab1ce9747d\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.861496 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqlc9\" (UniqueName: \"kubernetes.io/projected/df2e306b-a192-4a55-a172-cdab1ce9747d-kube-api-access-rqlc9\") pod \"df2e306b-a192-4a55-a172-cdab1ce9747d\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.861647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-catalog-content\") pod \"df2e306b-a192-4a55-a172-cdab1ce9747d\" (UID: \"df2e306b-a192-4a55-a172-cdab1ce9747d\") " Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.864917 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-utilities" (OuterVolumeSpecName: "utilities") pod "df2e306b-a192-4a55-a172-cdab1ce9747d" (UID: "df2e306b-a192-4a55-a172-cdab1ce9747d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.872206 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.873085 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2e306b-a192-4a55-a172-cdab1ce9747d-kube-api-access-rqlc9" (OuterVolumeSpecName: "kube-api-access-rqlc9") pod "df2e306b-a192-4a55-a172-cdab1ce9747d" (UID: "df2e306b-a192-4a55-a172-cdab1ce9747d"). InnerVolumeSpecName "kube-api-access-rqlc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.900437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df2e306b-a192-4a55-a172-cdab1ce9747d" (UID: "df2e306b-a192-4a55-a172-cdab1ce9747d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.934569 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.964025 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqlc9\" (UniqueName: \"kubernetes.io/projected/df2e306b-a192-4a55-a172-cdab1ce9747d-kube-api-access-rqlc9\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.964064 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:02 crc kubenswrapper[4735]: I0317 03:58:02.964077 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2e306b-a192-4a55-a172-cdab1ce9747d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.215445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" event={"ID":"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6","Type":"ContainerStarted","Data":"5853a9621103209c6166d1286ee3b1c4a553137f7c11cdffe27f80ebd9c9064a"} Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.218353 4735 generic.go:334] "Generic (PLEG): container finished" podID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerID="3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7" exitCode=0 Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.218435 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjcn8" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.218508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerDied","Data":"3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7"} Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.218550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjcn8" event={"ID":"df2e306b-a192-4a55-a172-cdab1ce9747d","Type":"ContainerDied","Data":"c88d8513eea2727595daceaace8bf7378265c5dc945832ca74732127da99d464"} Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.218567 4735 scope.go:117] "RemoveContainer" containerID="3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.242296 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" podStartSLOduration=2.124089899 podStartE2EDuration="3.242274563s" podCreationTimestamp="2026-03-17 03:58:00 +0000 UTC" firstStartedPulling="2026-03-17 03:58:01.166419532 +0000 UTC m=+10106.798652510" lastFinishedPulling="2026-03-17 03:58:02.284604156 +0000 UTC m=+10107.916837174" observedRunningTime="2026-03-17 03:58:03.233660735 +0000 UTC m=+10108.865893713" watchObservedRunningTime="2026-03-17 03:58:03.242274563 +0000 UTC m=+10108.874507571" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.242735 4735 scope.go:117] "RemoveContainer" containerID="907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.261660 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjcn8"] Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.272761 4735 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-xjcn8"] Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.280336 4735 scope.go:117] "RemoveContainer" containerID="45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.304028 4735 scope.go:117] "RemoveContainer" containerID="3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7" Mar 17 03:58:03 crc kubenswrapper[4735]: E0317 03:58:03.306960 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7\": container with ID starting with 3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7 not found: ID does not exist" containerID="3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.307540 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7"} err="failed to get container status \"3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7\": rpc error: code = NotFound desc = could not find container \"3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7\": container with ID starting with 3614d15ec24166cbaf468fea809a599b8f24f9c06d1e9e4921ddba05bce9afc7 not found: ID does not exist" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.307573 4735 scope.go:117] "RemoveContainer" containerID="907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463" Mar 17 03:58:03 crc kubenswrapper[4735]: E0317 03:58:03.308194 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463\": container with ID starting with 
907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463 not found: ID does not exist" containerID="907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.308222 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463"} err="failed to get container status \"907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463\": rpc error: code = NotFound desc = could not find container \"907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463\": container with ID starting with 907f8f9488723fed478dd08d0bdd12a69e2d648062962bee704e3109348e0463 not found: ID does not exist" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.308235 4735 scope.go:117] "RemoveContainer" containerID="45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65" Mar 17 03:58:03 crc kubenswrapper[4735]: E0317 03:58:03.308563 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65\": container with ID starting with 45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65 not found: ID does not exist" containerID="45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65" Mar 17 03:58:03 crc kubenswrapper[4735]: I0317 03:58:03.308587 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65"} err="failed to get container status \"45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65\": rpc error: code = NotFound desc = could not find container \"45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65\": container with ID starting with 45cb734a7eb5bace6486f3ada876ab213ba885515cebe26ecc605b756077ad65 not found: ID does not 
exist" Mar 17 03:58:04 crc kubenswrapper[4735]: I0317 03:58:04.230488 4735 generic.go:334] "Generic (PLEG): container finished" podID="acd4e003-fc98-4302-b4a0-f3f25c7fb9a6" containerID="5853a9621103209c6166d1286ee3b1c4a553137f7c11cdffe27f80ebd9c9064a" exitCode=0 Mar 17 03:58:04 crc kubenswrapper[4735]: I0317 03:58:04.230563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" event={"ID":"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6","Type":"ContainerDied","Data":"5853a9621103209c6166d1286ee3b1c4a553137f7c11cdffe27f80ebd9c9064a"} Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.099443 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" path="/var/lib/kubelet/pods/df2e306b-a192-4a55-a172-cdab1ce9747d/volumes" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.150781 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5mwn"] Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.151729 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n5mwn" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="registry-server" containerID="cri-o://e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db" gracePeriod=2 Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.625812 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.725002 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh9lm\" (UniqueName: \"kubernetes.io/projected/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6-kube-api-access-bh9lm\") pod \"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6\" (UID: \"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6\") " Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.730632 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6-kube-api-access-bh9lm" (OuterVolumeSpecName: "kube-api-access-bh9lm") pod "acd4e003-fc98-4302-b4a0-f3f25c7fb9a6" (UID: "acd4e003-fc98-4302-b4a0-f3f25c7fb9a6"). InnerVolumeSpecName "kube-api-access-bh9lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.748782 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.826851 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-utilities\") pod \"2f67eba2-a294-4428-973e-f2ae16e9d88f\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.826954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-catalog-content\") pod \"2f67eba2-a294-4428-973e-f2ae16e9d88f\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.826997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szgz4\" (UniqueName: \"kubernetes.io/projected/2f67eba2-a294-4428-973e-f2ae16e9d88f-kube-api-access-szgz4\") pod \"2f67eba2-a294-4428-973e-f2ae16e9d88f\" (UID: \"2f67eba2-a294-4428-973e-f2ae16e9d88f\") " Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.827366 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh9lm\" (UniqueName: \"kubernetes.io/projected/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6-kube-api-access-bh9lm\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.827709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-utilities" (OuterVolumeSpecName: "utilities") pod "2f67eba2-a294-4428-973e-f2ae16e9d88f" (UID: "2f67eba2-a294-4428-973e-f2ae16e9d88f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.830296 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f67eba2-a294-4428-973e-f2ae16e9d88f-kube-api-access-szgz4" (OuterVolumeSpecName: "kube-api-access-szgz4") pod "2f67eba2-a294-4428-973e-f2ae16e9d88f" (UID: "2f67eba2-a294-4428-973e-f2ae16e9d88f"). InnerVolumeSpecName "kube-api-access-szgz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.877974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f67eba2-a294-4428-973e-f2ae16e9d88f" (UID: "2f67eba2-a294-4428-973e-f2ae16e9d88f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.929260 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.929286 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f67eba2-a294-4428-973e-f2ae16e9d88f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:05 crc kubenswrapper[4735]: I0317 03:58:05.929297 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szgz4\" (UniqueName: \"kubernetes.io/projected/2f67eba2-a294-4428-973e-f2ae16e9d88f-kube-api-access-szgz4\") on node \"crc\" DevicePath \"\"" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.250505 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.250590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561998-jkmdf" event={"ID":"acd4e003-fc98-4302-b4a0-f3f25c7fb9a6","Type":"ContainerDied","Data":"146dbea1fc1e766bf31b39a3c521ffe2dea651c3686cb0aa20c591ac870c80c2"} Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.250637 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146dbea1fc1e766bf31b39a3c521ffe2dea651c3686cb0aa20c591ac870c80c2" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.253439 4735 generic.go:334] "Generic (PLEG): container finished" podID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerID="e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db" exitCode=0 Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.253478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerDied","Data":"e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db"} Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.253505 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5mwn" event={"ID":"2f67eba2-a294-4428-973e-f2ae16e9d88f","Type":"ContainerDied","Data":"d8c3ad578a30a4e6654d0fcb5a78b9649a0499e5e96ecb1a9c42610f3c172e10"} Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.253521 4735 scope.go:117] "RemoveContainer" containerID="e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.253534 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5mwn" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.280358 4735 scope.go:117] "RemoveContainer" containerID="e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.341206 4735 scope.go:117] "RemoveContainer" containerID="63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.344017 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561992-pppsp"] Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.356019 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561992-pppsp"] Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.378129 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5mwn"] Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.387919 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n5mwn"] Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.388525 4735 scope.go:117] "RemoveContainer" containerID="e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db" Mar 17 03:58:06 crc kubenswrapper[4735]: E0317 03:58:06.389004 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db\": container with ID starting with e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db not found: ID does not exist" containerID="e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.389031 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db"} err="failed to get container status \"e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db\": rpc error: code = NotFound desc = could not find container \"e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db\": container with ID starting with e7c7360f24971528276c601508c9002d8acd8a72756cc9b935d9daf28d20a9db not found: ID does not exist" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.389049 4735 scope.go:117] "RemoveContainer" containerID="e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003" Mar 17 03:58:06 crc kubenswrapper[4735]: E0317 03:58:06.389240 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003\": container with ID starting with e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003 not found: ID does not exist" containerID="e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.389260 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003"} err="failed to get container status \"e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003\": rpc error: code = NotFound desc = could not find container \"e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003\": container with ID starting with e422a799d119d714922463a5bcd626fe7b961ed7f5ba92412bf05b0e4521a003 not found: ID does not exist" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.389272 4735 scope.go:117] "RemoveContainer" containerID="63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937" Mar 17 03:58:06 crc kubenswrapper[4735]: E0317 03:58:06.389456 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937\": container with ID starting with 63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937 not found: ID does not exist" containerID="63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937" Mar 17 03:58:06 crc kubenswrapper[4735]: I0317 03:58:06.389476 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937"} err="failed to get container status \"63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937\": rpc error: code = NotFound desc = could not find container \"63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937\": container with ID starting with 63c83842642e2e9d49c1a663e2b6f2a4c82f77837257672a31c26a5406e55937 not found: ID does not exist" Mar 17 03:58:07 crc kubenswrapper[4735]: I0317 03:58:07.076964 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:58:07 crc kubenswrapper[4735]: E0317 03:58:07.077560 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:58:07 crc kubenswrapper[4735]: I0317 03:58:07.083626 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" path="/var/lib/kubelet/pods/2f67eba2-a294-4428-973e-f2ae16e9d88f/volumes" Mar 17 03:58:07 crc kubenswrapper[4735]: I0317 03:58:07.084670 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3263becc-fa75-443e-a544-9b021916eaf5" path="/var/lib/kubelet/pods/3263becc-fa75-443e-a544-9b021916eaf5/volumes" Mar 17 03:58:09 crc kubenswrapper[4735]: I0317 03:58:09.727739 4735 scope.go:117] "RemoveContainer" containerID="94fd84b18f09ba4e3d766483b532d464a2e7fafd4bf5cb978ef89abd40adf66c" Mar 17 03:58:21 crc kubenswrapper[4735]: I0317 03:58:21.073533 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:58:21 crc kubenswrapper[4735]: E0317 03:58:21.074415 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:58:33 crc kubenswrapper[4735]: I0317 03:58:33.072926 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:58:33 crc kubenswrapper[4735]: E0317 03:58:33.073702 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:58:46 crc kubenswrapper[4735]: I0317 03:58:46.073422 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:58:46 crc kubenswrapper[4735]: E0317 03:58:46.074298 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:58:58 crc kubenswrapper[4735]: I0317 03:58:58.073322 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:58:58 crc kubenswrapper[4735]: E0317 03:58:58.074142 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:59:10 crc kubenswrapper[4735]: I0317 03:59:10.073263 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:59:10 crc kubenswrapper[4735]: E0317 03:59:10.074303 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:59:21 crc kubenswrapper[4735]: I0317 03:59:21.073065 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:59:21 crc kubenswrapper[4735]: E0317 03:59:21.073888 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:59:35 crc kubenswrapper[4735]: I0317 03:59:35.089610 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:59:35 crc kubenswrapper[4735]: E0317 03:59:35.092161 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 03:59:49 crc kubenswrapper[4735]: I0317 03:59:49.073093 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 03:59:49 crc kubenswrapper[4735]: E0317 03:59:49.075627 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.175074 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562000-rnl52"] Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176084 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="registry-server" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176098 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="registry-server" Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176111 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd4e003-fc98-4302-b4a0-f3f25c7fb9a6" containerName="oc" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176117 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd4e003-fc98-4302-b4a0-f3f25c7fb9a6" containerName="oc" Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176140 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="extract-utilities" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176148 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="extract-utilities" Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176164 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="registry-server" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176169 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="registry-server" Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176178 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="extract-content" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176184 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="extract-content" Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176191 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="extract-utilities" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176196 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="extract-utilities" Mar 17 04:00:00 crc kubenswrapper[4735]: E0317 04:00:00.176216 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="extract-content" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176222 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="extract-content" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176395 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd4e003-fc98-4302-b4a0-f3f25c7fb9a6" containerName="oc" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176422 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2e306b-a192-4a55-a172-cdab1ce9747d" containerName="registry-server" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.176445 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f67eba2-a294-4428-973e-f2ae16e9d88f" containerName="registry-server" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.177065 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.181347 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.181844 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.181960 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.223550 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z"] Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.227221 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.228590 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562000-rnl52"] Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.233934 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.234110 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.254854 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z"] Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.338344 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgmcq\" (UniqueName: 
\"kubernetes.io/projected/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-kube-api-access-xgmcq\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.338421 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bqsr\" (UniqueName: \"kubernetes.io/projected/dafdc698-3758-49cd-bc73-1d4e51eefc46-kube-api-access-7bqsr\") pod \"auto-csr-approver-29562000-rnl52\" (UID: \"dafdc698-3758-49cd-bc73-1d4e51eefc46\") " pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.338543 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-config-volume\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.338605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-secret-volume\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.440416 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-secret-volume\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 
crc kubenswrapper[4735]: I0317 04:00:00.440520 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgmcq\" (UniqueName: \"kubernetes.io/projected/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-kube-api-access-xgmcq\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.440545 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bqsr\" (UniqueName: \"kubernetes.io/projected/dafdc698-3758-49cd-bc73-1d4e51eefc46-kube-api-access-7bqsr\") pod \"auto-csr-approver-29562000-rnl52\" (UID: \"dafdc698-3758-49cd-bc73-1d4e51eefc46\") " pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.440623 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-config-volume\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.441519 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-config-volume\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.459443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-secret-volume\") pod \"collect-profiles-29562000-6h24z\" (UID: 
\"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.466741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bqsr\" (UniqueName: \"kubernetes.io/projected/dafdc698-3758-49cd-bc73-1d4e51eefc46-kube-api-access-7bqsr\") pod \"auto-csr-approver-29562000-rnl52\" (UID: \"dafdc698-3758-49cd-bc73-1d4e51eefc46\") " pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.472218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgmcq\" (UniqueName: \"kubernetes.io/projected/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-kube-api-access-xgmcq\") pod \"collect-profiles-29562000-6h24z\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.511740 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:00 crc kubenswrapper[4735]: I0317 04:00:00.551167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.073936 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:00:01 crc kubenswrapper[4735]: E0317 04:00:01.074582 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.091060 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z"] Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.096918 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562000-rnl52"] Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.437271 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" event={"ID":"e5fb6e8a-af68-4b6d-bc73-214ad6a09657","Type":"ContainerStarted","Data":"d5b5072e558b87e92ec1a0e0b48a57be18ceed3d6e3694173a9b24a1d42d2512"} Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.437578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" event={"ID":"e5fb6e8a-af68-4b6d-bc73-214ad6a09657","Type":"ContainerStarted","Data":"ddbf86b386c40b95c9f69be86a0aec17137e8b3d4d54ce9c77aaafb6fcb8cc9d"} Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.441278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29562000-rnl52" event={"ID":"dafdc698-3758-49cd-bc73-1d4e51eefc46","Type":"ContainerStarted","Data":"fe27aac7a0233989faf1ecbd5b200f0ded285d79bac480e44f965f75b133f3fb"} Mar 17 04:00:01 crc kubenswrapper[4735]: I0317 04:00:01.452666 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" podStartSLOduration=1.452646378 podStartE2EDuration="1.452646378s" podCreationTimestamp="2026-03-17 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:00:01.45149863 +0000 UTC m=+10227.083731608" watchObservedRunningTime="2026-03-17 04:00:01.452646378 +0000 UTC m=+10227.084879356" Mar 17 04:00:02 crc kubenswrapper[4735]: I0317 04:00:02.450596 4735 generic.go:334] "Generic (PLEG): container finished" podID="e5fb6e8a-af68-4b6d-bc73-214ad6a09657" containerID="d5b5072e558b87e92ec1a0e0b48a57be18ceed3d6e3694173a9b24a1d42d2512" exitCode=0 Mar 17 04:00:02 crc kubenswrapper[4735]: I0317 04:00:02.450720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" event={"ID":"e5fb6e8a-af68-4b6d-bc73-214ad6a09657","Type":"ContainerDied","Data":"d5b5072e558b87e92ec1a0e0b48a57be18ceed3d6e3694173a9b24a1d42d2512"} Mar 17 04:00:03 crc kubenswrapper[4735]: I0317 04:00:03.909629 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.029680 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-secret-volume\") pod \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.029719 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgmcq\" (UniqueName: \"kubernetes.io/projected/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-kube-api-access-xgmcq\") pod \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.029919 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-config-volume\") pod \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\" (UID: \"e5fb6e8a-af68-4b6d-bc73-214ad6a09657\") " Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.030328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5fb6e8a-af68-4b6d-bc73-214ad6a09657" (UID: "e5fb6e8a-af68-4b6d-bc73-214ad6a09657"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.035965 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5fb6e8a-af68-4b6d-bc73-214ad6a09657" (UID: "e5fb6e8a-af68-4b6d-bc73-214ad6a09657"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.036068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-kube-api-access-xgmcq" (OuterVolumeSpecName: "kube-api-access-xgmcq") pod "e5fb6e8a-af68-4b6d-bc73-214ad6a09657" (UID: "e5fb6e8a-af68-4b6d-bc73-214ad6a09657"). InnerVolumeSpecName "kube-api-access-xgmcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.133284 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.133353 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.133419 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgmcq\" (UniqueName: \"kubernetes.io/projected/e5fb6e8a-af68-4b6d-bc73-214ad6a09657-kube-api-access-xgmcq\") on node \"crc\" DevicePath \"\"" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.502727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" event={"ID":"e5fb6e8a-af68-4b6d-bc73-214ad6a09657","Type":"ContainerDied","Data":"ddbf86b386c40b95c9f69be86a0aec17137e8b3d4d54ce9c77aaafb6fcb8cc9d"} Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.502770 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddbf86b386c40b95c9f69be86a0aec17137e8b3d4d54ce9c77aaafb6fcb8cc9d" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.502830 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562000-6h24z" Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.543681 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx"] Mar 17 04:00:04 crc kubenswrapper[4735]: I0317 04:00:04.554145 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561955-mmfvx"] Mar 17 04:00:05 crc kubenswrapper[4735]: I0317 04:00:05.092627 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecf8c9f-f094-4249-8377-28d3fa0cf02a" path="/var/lib/kubelet/pods/0ecf8c9f-f094-4249-8377-28d3fa0cf02a/volumes" Mar 17 04:00:09 crc kubenswrapper[4735]: I0317 04:00:09.947306 4735 scope.go:117] "RemoveContainer" containerID="177f37481e6f4d4b57b9b837f3ae5611c0bbd75c8ded87ed360d29c984a03329" Mar 17 04:00:16 crc kubenswrapper[4735]: I0317 04:00:16.073393 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:00:16 crc kubenswrapper[4735]: E0317 04:00:16.074336 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:00:23 crc kubenswrapper[4735]: I0317 04:00:23.686243 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562000-rnl52" event={"ID":"dafdc698-3758-49cd-bc73-1d4e51eefc46","Type":"ContainerStarted","Data":"0e4e887af998f927634dd02f414bbc01dfe30de53ced95d31ac6fff5ecc57f9d"} Mar 17 04:00:23 crc kubenswrapper[4735]: I0317 04:00:23.713121 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562000-rnl52" podStartSLOduration=2.052925138 podStartE2EDuration="23.713096826s" podCreationTimestamp="2026-03-17 04:00:00 +0000 UTC" firstStartedPulling="2026-03-17 04:00:01.131170975 +0000 UTC m=+10226.763403953" lastFinishedPulling="2026-03-17 04:00:22.791342633 +0000 UTC m=+10248.423575641" observedRunningTime="2026-03-17 04:00:23.702205694 +0000 UTC m=+10249.334438682" watchObservedRunningTime="2026-03-17 04:00:23.713096826 +0000 UTC m=+10249.345329814" Mar 17 04:00:24 crc kubenswrapper[4735]: I0317 04:00:24.699355 4735 generic.go:334] "Generic (PLEG): container finished" podID="dafdc698-3758-49cd-bc73-1d4e51eefc46" containerID="0e4e887af998f927634dd02f414bbc01dfe30de53ced95d31ac6fff5ecc57f9d" exitCode=0 Mar 17 04:00:24 crc kubenswrapper[4735]: I0317 04:00:24.699418 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562000-rnl52" event={"ID":"dafdc698-3758-49cd-bc73-1d4e51eefc46","Type":"ContainerDied","Data":"0e4e887af998f927634dd02f414bbc01dfe30de53ced95d31ac6fff5ecc57f9d"} Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.254615 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.316023 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bqsr\" (UniqueName: \"kubernetes.io/projected/dafdc698-3758-49cd-bc73-1d4e51eefc46-kube-api-access-7bqsr\") pod \"dafdc698-3758-49cd-bc73-1d4e51eefc46\" (UID: \"dafdc698-3758-49cd-bc73-1d4e51eefc46\") " Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.337001 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafdc698-3758-49cd-bc73-1d4e51eefc46-kube-api-access-7bqsr" (OuterVolumeSpecName: "kube-api-access-7bqsr") pod "dafdc698-3758-49cd-bc73-1d4e51eefc46" (UID: "dafdc698-3758-49cd-bc73-1d4e51eefc46"). InnerVolumeSpecName "kube-api-access-7bqsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.418932 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bqsr\" (UniqueName: \"kubernetes.io/projected/dafdc698-3758-49cd-bc73-1d4e51eefc46-kube-api-access-7bqsr\") on node \"crc\" DevicePath \"\"" Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.730077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562000-rnl52" event={"ID":"dafdc698-3758-49cd-bc73-1d4e51eefc46","Type":"ContainerDied","Data":"fe27aac7a0233989faf1ecbd5b200f0ded285d79bac480e44f965f75b133f3fb"} Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.730134 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe27aac7a0233989faf1ecbd5b200f0ded285d79bac480e44f965f75b133f3fb" Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.730193 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562000-rnl52" Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.839674 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561994-pbvfk"] Mar 17 04:00:26 crc kubenswrapper[4735]: I0317 04:00:26.847815 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561994-pbvfk"] Mar 17 04:00:27 crc kubenswrapper[4735]: I0317 04:00:27.085158 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833bc2e6-f71c-4b83-bf69-d37773c5b08a" path="/var/lib/kubelet/pods/833bc2e6-f71c-4b83-bf69-d37773c5b08a/volumes" Mar 17 04:00:30 crc kubenswrapper[4735]: I0317 04:00:30.073743 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:00:30 crc kubenswrapper[4735]: E0317 04:00:30.074575 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:00:41 crc kubenswrapper[4735]: I0317 04:00:41.076342 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:00:41 crc kubenswrapper[4735]: E0317 04:00:41.077270 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:00:55 crc kubenswrapper[4735]: I0317 04:00:55.081038 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:00:55 crc kubenswrapper[4735]: E0317 04:00:55.082355 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.179280 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29562001-vwd6p"] Mar 17 04:01:00 crc kubenswrapper[4735]: E0317 04:01:00.180523 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fb6e8a-af68-4b6d-bc73-214ad6a09657" containerName="collect-profiles" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.180548 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fb6e8a-af68-4b6d-bc73-214ad6a09657" containerName="collect-profiles" Mar 17 04:01:00 crc kubenswrapper[4735]: E0317 04:01:00.180584 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafdc698-3758-49cd-bc73-1d4e51eefc46" containerName="oc" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.180598 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafdc698-3758-49cd-bc73-1d4e51eefc46" containerName="oc" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.181003 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fb6e8a-af68-4b6d-bc73-214ad6a09657" containerName="collect-profiles" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.181066 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dafdc698-3758-49cd-bc73-1d4e51eefc46" containerName="oc" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.182485 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.190982 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29562001-vwd6p"] Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.306322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-config-data\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.306367 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-combined-ca-bundle\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.306404 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-fernet-keys\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.306589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2ph\" (UniqueName: \"kubernetes.io/projected/1115ef04-a8ec-44b6-941d-a540ac1c895b-kube-api-access-wr2ph\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") 
" pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.408980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-fernet-keys\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.409137 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2ph\" (UniqueName: \"kubernetes.io/projected/1115ef04-a8ec-44b6-941d-a540ac1c895b-kube-api-access-wr2ph\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.409377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-config-data\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.409422 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-combined-ca-bundle\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.417768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-combined-ca-bundle\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 
04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.418508 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-config-data\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.420121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-fernet-keys\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.428341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2ph\" (UniqueName: \"kubernetes.io/projected/1115ef04-a8ec-44b6-941d-a540ac1c895b-kube-api-access-wr2ph\") pod \"keystone-cron-29562001-vwd6p\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:00 crc kubenswrapper[4735]: I0317 04:01:00.514049 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:01 crc kubenswrapper[4735]: W0317 04:01:01.029011 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1115ef04_a8ec_44b6_941d_a540ac1c895b.slice/crio-ce2b625d3b519b745728a90b59e3c4975a61bb6b6e72fda202f2d94006495ead WatchSource:0}: Error finding container ce2b625d3b519b745728a90b59e3c4975a61bb6b6e72fda202f2d94006495ead: Status 404 returned error can't find the container with id ce2b625d3b519b745728a90b59e3c4975a61bb6b6e72fda202f2d94006495ead Mar 17 04:01:01 crc kubenswrapper[4735]: I0317 04:01:01.033462 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29562001-vwd6p"] Mar 17 04:01:01 crc kubenswrapper[4735]: I0317 04:01:01.110443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562001-vwd6p" event={"ID":"1115ef04-a8ec-44b6-941d-a540ac1c895b","Type":"ContainerStarted","Data":"ce2b625d3b519b745728a90b59e3c4975a61bb6b6e72fda202f2d94006495ead"} Mar 17 04:01:02 crc kubenswrapper[4735]: I0317 04:01:02.123449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562001-vwd6p" event={"ID":"1115ef04-a8ec-44b6-941d-a540ac1c895b","Type":"ContainerStarted","Data":"b820ca0690db5243dbeafda1cdf885c0b8ede92350e4eae2615dc238177d33a0"} Mar 17 04:01:02 crc kubenswrapper[4735]: I0317 04:01:02.151778 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29562001-vwd6p" podStartSLOduration=2.15175655 podStartE2EDuration="2.15175655s" podCreationTimestamp="2026-03-17 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:01:02.141730189 +0000 UTC m=+10287.773963207" watchObservedRunningTime="2026-03-17 04:01:02.15175655 +0000 UTC m=+10287.783989538" Mar 17 04:01:05 crc 
kubenswrapper[4735]: I0317 04:01:05.154296 4735 generic.go:334] "Generic (PLEG): container finished" podID="1115ef04-a8ec-44b6-941d-a540ac1c895b" containerID="b820ca0690db5243dbeafda1cdf885c0b8ede92350e4eae2615dc238177d33a0" exitCode=0 Mar 17 04:01:05 crc kubenswrapper[4735]: I0317 04:01:05.154444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562001-vwd6p" event={"ID":"1115ef04-a8ec-44b6-941d-a540ac1c895b","Type":"ContainerDied","Data":"b820ca0690db5243dbeafda1cdf885c0b8ede92350e4eae2615dc238177d33a0"} Mar 17 04:01:06 crc kubenswrapper[4735]: I0317 04:01:06.908204 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.059703 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-config-data\") pod \"1115ef04-a8ec-44b6-941d-a540ac1c895b\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.059817 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2ph\" (UniqueName: \"kubernetes.io/projected/1115ef04-a8ec-44b6-941d-a540ac1c895b-kube-api-access-wr2ph\") pod \"1115ef04-a8ec-44b6-941d-a540ac1c895b\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.059839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-combined-ca-bundle\") pod \"1115ef04-a8ec-44b6-941d-a540ac1c895b\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.059883 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-fernet-keys\") pod \"1115ef04-a8ec-44b6-941d-a540ac1c895b\" (UID: \"1115ef04-a8ec-44b6-941d-a540ac1c895b\") " Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.073105 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1115ef04-a8ec-44b6-941d-a540ac1c895b-kube-api-access-wr2ph" (OuterVolumeSpecName: "kube-api-access-wr2ph") pod "1115ef04-a8ec-44b6-941d-a540ac1c895b" (UID: "1115ef04-a8ec-44b6-941d-a540ac1c895b"). InnerVolumeSpecName "kube-api-access-wr2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.078723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1115ef04-a8ec-44b6-941d-a540ac1c895b" (UID: "1115ef04-a8ec-44b6-941d-a540ac1c895b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.107540 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1115ef04-a8ec-44b6-941d-a540ac1c895b" (UID: "1115ef04-a8ec-44b6-941d-a540ac1c895b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.145546 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-config-data" (OuterVolumeSpecName: "config-data") pod "1115ef04-a8ec-44b6-941d-a540ac1c895b" (UID: "1115ef04-a8ec-44b6-941d-a540ac1c895b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.162568 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.162608 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2ph\" (UniqueName: \"kubernetes.io/projected/1115ef04-a8ec-44b6-941d-a540ac1c895b-kube-api-access-wr2ph\") on node \"crc\" DevicePath \"\"" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.162638 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.162651 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1115ef04-a8ec-44b6-941d-a540ac1c895b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.173709 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29562001-vwd6p" Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.225277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562001-vwd6p" event={"ID":"1115ef04-a8ec-44b6-941d-a540ac1c895b","Type":"ContainerDied","Data":"ce2b625d3b519b745728a90b59e3c4975a61bb6b6e72fda202f2d94006495ead"} Mar 17 04:01:07 crc kubenswrapper[4735]: I0317 04:01:07.225367 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce2b625d3b519b745728a90b59e3c4975a61bb6b6e72fda202f2d94006495ead" Mar 17 04:01:09 crc kubenswrapper[4735]: I0317 04:01:09.074220 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:01:09 crc kubenswrapper[4735]: E0317 04:01:09.074808 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:01:10 crc kubenswrapper[4735]: I0317 04:01:10.002022 4735 scope.go:117] "RemoveContainer" containerID="be14bef69cd86649d1967a901ecc94b28d9c9208aa6523ee5d2210fad037b9f9" Mar 17 04:01:21 crc kubenswrapper[4735]: I0317 04:01:21.074015 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:01:21 crc kubenswrapper[4735]: E0317 04:01:21.075085 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:01:36 crc kubenswrapper[4735]: I0317 04:01:36.073549 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:01:36 crc kubenswrapper[4735]: E0317 04:01:36.074389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:01:48 crc kubenswrapper[4735]: I0317 04:01:48.073843 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:01:48 crc kubenswrapper[4735]: I0317 04:01:48.648295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"85e11aa11f60619705958d96d59d48081f51e908ddadda871bda8e3609aa4481"} Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.153847 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562002-9m5d4"] Mar 17 04:02:00 crc kubenswrapper[4735]: E0317 04:02:00.154687 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1115ef04-a8ec-44b6-941d-a540ac1c895b" containerName="keystone-cron" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.154699 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1115ef04-a8ec-44b6-941d-a540ac1c895b" containerName="keystone-cron" Mar 17 04:02:00 crc 
kubenswrapper[4735]: I0317 04:02:00.154961 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1115ef04-a8ec-44b6-941d-a540ac1c895b" containerName="keystone-cron" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.155528 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.158137 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.159153 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.160617 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.172220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562002-9m5d4"] Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.290931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drfs\" (UniqueName: \"kubernetes.io/projected/6e521b46-98e8-4ff9-83be-a3deea393417-kube-api-access-6drfs\") pod \"auto-csr-approver-29562002-9m5d4\" (UID: \"6e521b46-98e8-4ff9-83be-a3deea393417\") " pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.392690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drfs\" (UniqueName: \"kubernetes.io/projected/6e521b46-98e8-4ff9-83be-a3deea393417-kube-api-access-6drfs\") pod \"auto-csr-approver-29562002-9m5d4\" (UID: \"6e521b46-98e8-4ff9-83be-a3deea393417\") " pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.414701 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drfs\" (UniqueName: \"kubernetes.io/projected/6e521b46-98e8-4ff9-83be-a3deea393417-kube-api-access-6drfs\") pod \"auto-csr-approver-29562002-9m5d4\" (UID: \"6e521b46-98e8-4ff9-83be-a3deea393417\") " pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:00 crc kubenswrapper[4735]: I0317 04:02:00.478470 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:01 crc kubenswrapper[4735]: I0317 04:02:01.126501 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562002-9m5d4"] Mar 17 04:02:01 crc kubenswrapper[4735]: I0317 04:02:01.139429 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:02:01 crc kubenswrapper[4735]: I0317 04:02:01.768372 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" event={"ID":"6e521b46-98e8-4ff9-83be-a3deea393417","Type":"ContainerStarted","Data":"6a5ee943e3ae606d8739decea3c56eed8da6a545db6a8d22e656da6d9330cb07"} Mar 17 04:02:03 crc kubenswrapper[4735]: I0317 04:02:03.786230 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" event={"ID":"6e521b46-98e8-4ff9-83be-a3deea393417","Type":"ContainerStarted","Data":"4ba1be995ffe175eab2dbe3b517e7f73210afb3cc7889c2e97b983ad058e199e"} Mar 17 04:02:03 crc kubenswrapper[4735]: I0317 04:02:03.813593 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" podStartSLOduration=1.9786939650000002 podStartE2EDuration="3.81354418s" podCreationTimestamp="2026-03-17 04:02:00 +0000 UTC" firstStartedPulling="2026-03-17 04:02:01.139164722 +0000 UTC m=+10346.771397710" lastFinishedPulling="2026-03-17 04:02:02.974014937 +0000 UTC 
m=+10348.606247925" observedRunningTime="2026-03-17 04:02:03.807170976 +0000 UTC m=+10349.439403964" watchObservedRunningTime="2026-03-17 04:02:03.81354418 +0000 UTC m=+10349.445777168" Mar 17 04:02:04 crc kubenswrapper[4735]: I0317 04:02:04.802721 4735 generic.go:334] "Generic (PLEG): container finished" podID="6e521b46-98e8-4ff9-83be-a3deea393417" containerID="4ba1be995ffe175eab2dbe3b517e7f73210afb3cc7889c2e97b983ad058e199e" exitCode=0 Mar 17 04:02:04 crc kubenswrapper[4735]: I0317 04:02:04.802776 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" event={"ID":"6e521b46-98e8-4ff9-83be-a3deea393417","Type":"ContainerDied","Data":"4ba1be995ffe175eab2dbe3b517e7f73210afb3cc7889c2e97b983ad058e199e"} Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.466164 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.623561 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6drfs\" (UniqueName: \"kubernetes.io/projected/6e521b46-98e8-4ff9-83be-a3deea393417-kube-api-access-6drfs\") pod \"6e521b46-98e8-4ff9-83be-a3deea393417\" (UID: \"6e521b46-98e8-4ff9-83be-a3deea393417\") " Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.631100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e521b46-98e8-4ff9-83be-a3deea393417-kube-api-access-6drfs" (OuterVolumeSpecName: "kube-api-access-6drfs") pod "6e521b46-98e8-4ff9-83be-a3deea393417" (UID: "6e521b46-98e8-4ff9-83be-a3deea393417"). InnerVolumeSpecName "kube-api-access-6drfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.726074 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6drfs\" (UniqueName: \"kubernetes.io/projected/6e521b46-98e8-4ff9-83be-a3deea393417-kube-api-access-6drfs\") on node \"crc\" DevicePath \"\"" Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.825245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" event={"ID":"6e521b46-98e8-4ff9-83be-a3deea393417","Type":"ContainerDied","Data":"6a5ee943e3ae606d8739decea3c56eed8da6a545db6a8d22e656da6d9330cb07"} Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.825526 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5ee943e3ae606d8739decea3c56eed8da6a545db6a8d22e656da6d9330cb07" Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.825336 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562002-9m5d4" Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.896329 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561996-snxch"] Mar 17 04:02:06 crc kubenswrapper[4735]: I0317 04:02:06.904349 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561996-snxch"] Mar 17 04:02:07 crc kubenswrapper[4735]: I0317 04:02:07.095388 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b930ea0a-8da0-437c-9934-a2b9f694d031" path="/var/lib/kubelet/pods/b930ea0a-8da0-437c-9934-a2b9f694d031/volumes" Mar 17 04:02:10 crc kubenswrapper[4735]: I0317 04:02:10.079758 4735 scope.go:117] "RemoveContainer" containerID="521fa6a8a1ece3cf8d78f52a1ac0e3af7dd36fe51356bc5025a56e6730f5bcd6" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.142287 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-d29rl"] Mar 17 04:03:31 crc kubenswrapper[4735]: E0317 04:03:31.143654 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e521b46-98e8-4ff9-83be-a3deea393417" containerName="oc" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.143678 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e521b46-98e8-4ff9-83be-a3deea393417" containerName="oc" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.143871 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e521b46-98e8-4ff9-83be-a3deea393417" containerName="oc" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.154351 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.162155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d29rl"] Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.223566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-catalog-content\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.223635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcshr\" (UniqueName: \"kubernetes.io/projected/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-kube-api-access-lcshr\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.223671 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-utilities\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.324982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-catalog-content\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.325065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcshr\" (UniqueName: \"kubernetes.io/projected/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-kube-api-access-lcshr\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.325098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-utilities\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.325724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-catalog-content\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.325771 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-utilities\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.351560 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcshr\" (UniqueName: \"kubernetes.io/projected/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-kube-api-access-lcshr\") pod \"redhat-operators-d29rl\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:31 crc kubenswrapper[4735]: I0317 04:03:31.537689 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:32 crc kubenswrapper[4735]: I0317 04:03:32.500463 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d29rl"] Mar 17 04:03:32 crc kubenswrapper[4735]: I0317 04:03:32.971054 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d29rl" event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerDied","Data":"c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9"} Mar 17 04:03:32 crc kubenswrapper[4735]: I0317 04:03:32.972532 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerID="c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9" exitCode=0 Mar 17 04:03:32 crc kubenswrapper[4735]: I0317 04:03:32.972894 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d29rl" event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerStarted","Data":"dde3a710d20de6942b7bca0d2577fceeb72076e5f4a83658d40458276cf50620"} Mar 17 04:03:33 crc kubenswrapper[4735]: I0317 04:03:33.981897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-d29rl" event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerStarted","Data":"5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb"} Mar 17 04:03:40 crc kubenswrapper[4735]: I0317 04:03:40.050538 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerID="5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb" exitCode=0 Mar 17 04:03:40 crc kubenswrapper[4735]: I0317 04:03:40.050651 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d29rl" event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerDied","Data":"5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb"} Mar 17 04:03:42 crc kubenswrapper[4735]: I0317 04:03:42.074534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d29rl" event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerStarted","Data":"24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7"} Mar 17 04:03:42 crc kubenswrapper[4735]: I0317 04:03:42.099176 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d29rl" podStartSLOduration=3.417412265 podStartE2EDuration="11.095663982s" podCreationTimestamp="2026-03-17 04:03:31 +0000 UTC" firstStartedPulling="2026-03-17 04:03:32.973058926 +0000 UTC m=+10438.605291904" lastFinishedPulling="2026-03-17 04:03:40.651310613 +0000 UTC m=+10446.283543621" observedRunningTime="2026-03-17 04:03:42.089708759 +0000 UTC m=+10447.721941737" watchObservedRunningTime="2026-03-17 04:03:42.095663982 +0000 UTC m=+10447.727896960" Mar 17 04:03:51 crc kubenswrapper[4735]: I0317 04:03:51.538883 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:51 crc kubenswrapper[4735]: I0317 04:03:51.539513 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:03:52 crc kubenswrapper[4735]: I0317 04:03:52.607008 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d29rl" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" probeResult="failure" output=< Mar 17 04:03:52 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:03:52 crc kubenswrapper[4735]: > Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.200978 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562004-zpjn7"] Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.203527 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.210332 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562004-zpjn7"] Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.213199 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.213189 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.213219 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.390624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkltd\" (UniqueName: \"kubernetes.io/projected/06e96988-fe37-4e97-b955-6ddb502821dd-kube-api-access-pkltd\") pod \"auto-csr-approver-29562004-zpjn7\" (UID: \"06e96988-fe37-4e97-b955-6ddb502821dd\") " 
pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.493203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkltd\" (UniqueName: \"kubernetes.io/projected/06e96988-fe37-4e97-b955-6ddb502821dd-kube-api-access-pkltd\") pod \"auto-csr-approver-29562004-zpjn7\" (UID: \"06e96988-fe37-4e97-b955-6ddb502821dd\") " pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.540632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkltd\" (UniqueName: \"kubernetes.io/projected/06e96988-fe37-4e97-b955-6ddb502821dd-kube-api-access-pkltd\") pod \"auto-csr-approver-29562004-zpjn7\" (UID: \"06e96988-fe37-4e97-b955-6ddb502821dd\") " pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:00 crc kubenswrapper[4735]: I0317 04:04:00.559385 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:01 crc kubenswrapper[4735]: I0317 04:04:01.544494 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562004-zpjn7"] Mar 17 04:04:01 crc kubenswrapper[4735]: W0317 04:04:01.561081 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e96988_fe37_4e97_b955_6ddb502821dd.slice/crio-e5cd3c9422d36f458e058600c200f5bd4d91cad1dd160301208098334e1ff6e8 WatchSource:0}: Error finding container e5cd3c9422d36f458e058600c200f5bd4d91cad1dd160301208098334e1ff6e8: Status 404 returned error can't find the container with id e5cd3c9422d36f458e058600c200f5bd4d91cad1dd160301208098334e1ff6e8 Mar 17 04:04:02 crc kubenswrapper[4735]: I0317 04:04:02.244491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" 
event={"ID":"06e96988-fe37-4e97-b955-6ddb502821dd","Type":"ContainerStarted","Data":"e5cd3c9422d36f458e058600c200f5bd4d91cad1dd160301208098334e1ff6e8"} Mar 17 04:04:02 crc kubenswrapper[4735]: I0317 04:04:02.605254 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d29rl" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" probeResult="failure" output=< Mar 17 04:04:02 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:04:02 crc kubenswrapper[4735]: > Mar 17 04:04:04 crc kubenswrapper[4735]: I0317 04:04:04.267317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" event={"ID":"06e96988-fe37-4e97-b955-6ddb502821dd","Type":"ContainerStarted","Data":"ea82863211494c1f095ea93911117c23089bac5a0105c6ceac894df7d42bfe95"} Mar 17 04:04:04 crc kubenswrapper[4735]: I0317 04:04:04.291323 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" podStartSLOduration=3.172623763 podStartE2EDuration="4.291297569s" podCreationTimestamp="2026-03-17 04:04:00 +0000 UTC" firstStartedPulling="2026-03-17 04:04:01.561921146 +0000 UTC m=+10467.194154124" lastFinishedPulling="2026-03-17 04:04:02.680594952 +0000 UTC m=+10468.312827930" observedRunningTime="2026-03-17 04:04:04.280459708 +0000 UTC m=+10469.912692686" watchObservedRunningTime="2026-03-17 04:04:04.291297569 +0000 UTC m=+10469.923530547" Mar 17 04:04:05 crc kubenswrapper[4735]: I0317 04:04:05.275462 4735 generic.go:334] "Generic (PLEG): container finished" podID="06e96988-fe37-4e97-b955-6ddb502821dd" containerID="ea82863211494c1f095ea93911117c23089bac5a0105c6ceac894df7d42bfe95" exitCode=0 Mar 17 04:04:05 crc kubenswrapper[4735]: I0317 04:04:05.275565 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" 
event={"ID":"06e96988-fe37-4e97-b955-6ddb502821dd","Type":"ContainerDied","Data":"ea82863211494c1f095ea93911117c23089bac5a0105c6ceac894df7d42bfe95"} Mar 17 04:04:06 crc kubenswrapper[4735]: I0317 04:04:06.757041 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:06 crc kubenswrapper[4735]: I0317 04:04:06.932426 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkltd\" (UniqueName: \"kubernetes.io/projected/06e96988-fe37-4e97-b955-6ddb502821dd-kube-api-access-pkltd\") pod \"06e96988-fe37-4e97-b955-6ddb502821dd\" (UID: \"06e96988-fe37-4e97-b955-6ddb502821dd\") " Mar 17 04:04:06 crc kubenswrapper[4735]: I0317 04:04:06.946058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e96988-fe37-4e97-b955-6ddb502821dd-kube-api-access-pkltd" (OuterVolumeSpecName: "kube-api-access-pkltd") pod "06e96988-fe37-4e97-b955-6ddb502821dd" (UID: "06e96988-fe37-4e97-b955-6ddb502821dd"). InnerVolumeSpecName "kube-api-access-pkltd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:04:07 crc kubenswrapper[4735]: I0317 04:04:07.035022 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkltd\" (UniqueName: \"kubernetes.io/projected/06e96988-fe37-4e97-b955-6ddb502821dd-kube-api-access-pkltd\") on node \"crc\" DevicePath \"\"" Mar 17 04:04:07 crc kubenswrapper[4735]: I0317 04:04:07.291250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" event={"ID":"06e96988-fe37-4e97-b955-6ddb502821dd","Type":"ContainerDied","Data":"e5cd3c9422d36f458e058600c200f5bd4d91cad1dd160301208098334e1ff6e8"} Mar 17 04:04:07 crc kubenswrapper[4735]: I0317 04:04:07.291292 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5cd3c9422d36f458e058600c200f5bd4d91cad1dd160301208098334e1ff6e8" Mar 17 04:04:07 crc kubenswrapper[4735]: I0317 04:04:07.291341 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562004-zpjn7" Mar 17 04:04:07 crc kubenswrapper[4735]: I0317 04:04:07.380091 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561998-jkmdf"] Mar 17 04:04:07 crc kubenswrapper[4735]: I0317 04:04:07.387657 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561998-jkmdf"] Mar 17 04:04:09 crc kubenswrapper[4735]: I0317 04:04:09.097756 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd4e003-fc98-4302-b4a0-f3f25c7fb9a6" path="/var/lib/kubelet/pods/acd4e003-fc98-4302-b4a0-f3f25c7fb9a6/volumes" Mar 17 04:04:10 crc kubenswrapper[4735]: I0317 04:04:10.211182 4735 scope.go:117] "RemoveContainer" containerID="5853a9621103209c6166d1286ee3b1c4a553137f7c11cdffe27f80ebd9c9064a" Mar 17 04:04:12 crc kubenswrapper[4735]: I0317 04:04:12.597547 4735 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-d29rl" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" probeResult="failure" output=< Mar 17 04:04:12 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:04:12 crc kubenswrapper[4735]: > Mar 17 04:04:12 crc kubenswrapper[4735]: I0317 04:04:12.606083 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:04:12 crc kubenswrapper[4735]: I0317 04:04:12.606146 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:04:22 crc kubenswrapper[4735]: I0317 04:04:22.608537 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d29rl" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" probeResult="failure" output=< Mar 17 04:04:22 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:04:22 crc kubenswrapper[4735]: > Mar 17 04:04:31 crc kubenswrapper[4735]: I0317 04:04:31.614334 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:04:31 crc kubenswrapper[4735]: I0317 04:04:31.685635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:04:32 crc kubenswrapper[4735]: I0317 04:04:32.359122 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-d29rl"] Mar 17 04:04:33 crc kubenswrapper[4735]: I0317 04:04:33.538709 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d29rl" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" containerID="cri-o://24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7" gracePeriod=2 Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.322079 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.404232 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcshr\" (UniqueName: \"kubernetes.io/projected/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-kube-api-access-lcshr\") pod \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.405179 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-catalog-content\") pod \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.405469 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-utilities\") pod \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\" (UID: \"d0b2f118-1ed6-4400-838e-4c33b3ef52ba\") " Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.405939 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-utilities" (OuterVolumeSpecName: "utilities") pod "d0b2f118-1ed6-4400-838e-4c33b3ef52ba" (UID: 
"d0b2f118-1ed6-4400-838e-4c33b3ef52ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.406888 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.444790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-kube-api-access-lcshr" (OuterVolumeSpecName: "kube-api-access-lcshr") pod "d0b2f118-1ed6-4400-838e-4c33b3ef52ba" (UID: "d0b2f118-1ed6-4400-838e-4c33b3ef52ba"). InnerVolumeSpecName "kube-api-access-lcshr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.509160 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcshr\" (UniqueName: \"kubernetes.io/projected/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-kube-api-access-lcshr\") on node \"crc\" DevicePath \"\"" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.545284 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerID="24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7" exitCode=0 Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.545322 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d29rl" event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerDied","Data":"24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7"} Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.545346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d29rl" 
event={"ID":"d0b2f118-1ed6-4400-838e-4c33b3ef52ba","Type":"ContainerDied","Data":"dde3a710d20de6942b7bca0d2577fceeb72076e5f4a83658d40458276cf50620"} Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.545365 4735 scope.go:117] "RemoveContainer" containerID="24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.545474 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d29rl" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.570054 4735 scope.go:117] "RemoveContainer" containerID="5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.597322 4735 scope.go:117] "RemoveContainer" containerID="c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.619688 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0b2f118-1ed6-4400-838e-4c33b3ef52ba" (UID: "d0b2f118-1ed6-4400-838e-4c33b3ef52ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.638041 4735 scope.go:117] "RemoveContainer" containerID="24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7" Mar 17 04:04:34 crc kubenswrapper[4735]: E0317 04:04:34.643413 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7\": container with ID starting with 24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7 not found: ID does not exist" containerID="24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.643489 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7"} err="failed to get container status \"24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7\": rpc error: code = NotFound desc = could not find container \"24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7\": container with ID starting with 24467273e4529a8660bb412db65e59651c598276e1285ff35fac99aff8fcfac7 not found: ID does not exist" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.643517 4735 scope.go:117] "RemoveContainer" containerID="5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb" Mar 17 04:04:34 crc kubenswrapper[4735]: E0317 04:04:34.643977 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb\": container with ID starting with 5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb not found: ID does not exist" containerID="5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.644179 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb"} err="failed to get container status \"5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb\": rpc error: code = NotFound desc = could not find container \"5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb\": container with ID starting with 5a5ef18a02851ab508770650da33dc51faf9bf2fc9396b3c5e4db0651d5b57bb not found: ID does not exist" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.644259 4735 scope.go:117] "RemoveContainer" containerID="c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9" Mar 17 04:04:34 crc kubenswrapper[4735]: E0317 04:04:34.644565 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9\": container with ID starting with c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9 not found: ID does not exist" containerID="c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.644596 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9"} err="failed to get container status \"c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9\": rpc error: code = NotFound desc = could not find container \"c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9\": container with ID starting with c8abbe10ba8cab9f1996301fea5fe810f1422f2f9a21f19e4b5583a5f2e96df9 not found: ID does not exist" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.712687 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d0b2f118-1ed6-4400-838e-4c33b3ef52ba-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.880145 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d29rl"] Mar 17 04:04:34 crc kubenswrapper[4735]: I0317 04:04:34.892516 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d29rl"] Mar 17 04:04:35 crc kubenswrapper[4735]: I0317 04:04:35.095515 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" path="/var/lib/kubelet/pods/d0b2f118-1ed6-4400-838e-4c33b3ef52ba/volumes" Mar 17 04:04:42 crc kubenswrapper[4735]: I0317 04:04:42.606988 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:04:42 crc kubenswrapper[4735]: I0317 04:04:42.607482 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.606505 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.607211 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" 
podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.607260 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.608086 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85e11aa11f60619705958d96d59d48081f51e908ddadda871bda8e3609aa4481"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.608133 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://85e11aa11f60619705958d96d59d48081f51e908ddadda871bda8e3609aa4481" gracePeriod=600 Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.990970 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="85e11aa11f60619705958d96d59d48081f51e908ddadda871bda8e3609aa4481" exitCode=0 Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.991070 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"85e11aa11f60619705958d96d59d48081f51e908ddadda871bda8e3609aa4481"} Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.991329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0"} Mar 17 04:05:12 crc kubenswrapper[4735]: I0317 04:05:12.991351 4735 scope.go:117] "RemoveContainer" containerID="32ec1b972505a10607b093bd3b0f57a0959052b85559ce8b6d05e5e71bc70487" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.761110 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5g6wn"] Mar 17 04:05:50 crc kubenswrapper[4735]: E0317 04:05:50.762901 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="extract-content" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.762934 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="extract-content" Mar 17 04:05:50 crc kubenswrapper[4735]: E0317 04:05:50.762965 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e96988-fe37-4e97-b955-6ddb502821dd" containerName="oc" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.762997 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e96988-fe37-4e97-b955-6ddb502821dd" containerName="oc" Mar 17 04:05:50 crc kubenswrapper[4735]: E0317 04:05:50.763019 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.763028 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" Mar 17 04:05:50 crc kubenswrapper[4735]: E0317 04:05:50.763062 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="extract-utilities" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 
04:05:50.763075 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="extract-utilities" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.764336 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e96988-fe37-4e97-b955-6ddb502821dd" containerName="oc" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.764393 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b2f118-1ed6-4400-838e-4c33b3ef52ba" containerName="registry-server" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.770110 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.795616 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5g6wn"] Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.896165 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-catalog-content\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.896244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzlz\" (UniqueName: \"kubernetes.io/projected/90f7ae8e-e54c-480b-8f55-080d89b1da90-kube-api-access-mwzlz\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.896493 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-utilities\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.997940 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-catalog-content\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.998011 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzlz\" (UniqueName: \"kubernetes.io/projected/90f7ae8e-e54c-480b-8f55-080d89b1da90-kube-api-access-mwzlz\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.998084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-utilities\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.998501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-catalog-content\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:50 crc kubenswrapper[4735]: I0317 04:05:50.998540 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-utilities\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:51 crc kubenswrapper[4735]: I0317 04:05:51.025787 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwzlz\" (UniqueName: \"kubernetes.io/projected/90f7ae8e-e54c-480b-8f55-080d89b1da90-kube-api-access-mwzlz\") pod \"community-operators-5g6wn\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:51 crc kubenswrapper[4735]: I0317 04:05:51.097958 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:05:51 crc kubenswrapper[4735]: I0317 04:05:51.646641 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5g6wn"] Mar 17 04:05:51 crc kubenswrapper[4735]: W0317 04:05:51.649353 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90f7ae8e_e54c_480b_8f55_080d89b1da90.slice/crio-acffd28605a611dd37678dc783bfdc4bcf84d1235a3c509055d93deca44b9daf WatchSource:0}: Error finding container acffd28605a611dd37678dc783bfdc4bcf84d1235a3c509055d93deca44b9daf: Status 404 returned error can't find the container with id acffd28605a611dd37678dc783bfdc4bcf84d1235a3c509055d93deca44b9daf Mar 17 04:05:52 crc kubenswrapper[4735]: I0317 04:05:52.394610 4735 generic.go:334] "Generic (PLEG): container finished" podID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerID="7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a" exitCode=0 Mar 17 04:05:52 crc kubenswrapper[4735]: I0317 04:05:52.394879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g6wn" 
event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerDied","Data":"7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a"} Mar 17 04:05:52 crc kubenswrapper[4735]: I0317 04:05:52.394902 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g6wn" event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerStarted","Data":"acffd28605a611dd37678dc783bfdc4bcf84d1235a3c509055d93deca44b9daf"} Mar 17 04:05:53 crc kubenswrapper[4735]: I0317 04:05:53.405224 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g6wn" event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerStarted","Data":"275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010"} Mar 17 04:05:55 crc kubenswrapper[4735]: I0317 04:05:55.425884 4735 generic.go:334] "Generic (PLEG): container finished" podID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerID="275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010" exitCode=0 Mar 17 04:05:55 crc kubenswrapper[4735]: I0317 04:05:55.425977 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g6wn" event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerDied","Data":"275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010"} Mar 17 04:05:57 crc kubenswrapper[4735]: I0317 04:05:57.444354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g6wn" event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerStarted","Data":"9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca"} Mar 17 04:05:57 crc kubenswrapper[4735]: I0317 04:05:57.472235 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5g6wn" podStartSLOduration=3.750946764 podStartE2EDuration="7.472220298s" podCreationTimestamp="2026-03-17 04:05:50 
+0000 UTC" firstStartedPulling="2026-03-17 04:05:52.40024266 +0000 UTC m=+10578.032475638" lastFinishedPulling="2026-03-17 04:05:56.121516194 +0000 UTC m=+10581.753749172" observedRunningTime="2026-03-17 04:05:57.465389353 +0000 UTC m=+10583.097622331" watchObservedRunningTime="2026-03-17 04:05:57.472220298 +0000 UTC m=+10583.104453276" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.160499 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562006-79rjz"] Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.162214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.172304 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.172305 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.172656 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.184659 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562006-79rjz"] Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.287352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkxqq\" (UniqueName: \"kubernetes.io/projected/153caacf-e2c5-4bcc-9df8-cf0302aff746-kube-api-access-hkxqq\") pod \"auto-csr-approver-29562006-79rjz\" (UID: \"153caacf-e2c5-4bcc-9df8-cf0302aff746\") " pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.388934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hkxqq\" (UniqueName: \"kubernetes.io/projected/153caacf-e2c5-4bcc-9df8-cf0302aff746-kube-api-access-hkxqq\") pod \"auto-csr-approver-29562006-79rjz\" (UID: \"153caacf-e2c5-4bcc-9df8-cf0302aff746\") " pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.413617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkxqq\" (UniqueName: \"kubernetes.io/projected/153caacf-e2c5-4bcc-9df8-cf0302aff746-kube-api-access-hkxqq\") pod \"auto-csr-approver-29562006-79rjz\" (UID: \"153caacf-e2c5-4bcc-9df8-cf0302aff746\") " pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:00 crc kubenswrapper[4735]: I0317 04:06:00.481498 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:01 crc kubenswrapper[4735]: I0317 04:06:01.098393 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:06:01 crc kubenswrapper[4735]: I0317 04:06:01.098739 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:06:01 crc kubenswrapper[4735]: I0317 04:06:01.199976 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562006-79rjz"] Mar 17 04:06:01 crc kubenswrapper[4735]: I0317 04:06:01.473967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562006-79rjz" event={"ID":"153caacf-e2c5-4bcc-9df8-cf0302aff746","Type":"ContainerStarted","Data":"5943833dd22f1cc7c619a9f2226d4c64efb6c684a1576f1485d3e1bad3c2def1"} Mar 17 04:06:02 crc kubenswrapper[4735]: I0317 04:06:02.154295 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5g6wn" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" 
containerName="registry-server" probeResult="failure" output=< Mar 17 04:06:02 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:06:02 crc kubenswrapper[4735]: > Mar 17 04:06:03 crc kubenswrapper[4735]: I0317 04:06:03.516839 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562006-79rjz" event={"ID":"153caacf-e2c5-4bcc-9df8-cf0302aff746","Type":"ContainerStarted","Data":"41182b53a83fc6fff77ce10dfadc68a842da06daa24856305548007cf32e1ed6"} Mar 17 04:06:04 crc kubenswrapper[4735]: I0317 04:06:04.528160 4735 generic.go:334] "Generic (PLEG): container finished" podID="153caacf-e2c5-4bcc-9df8-cf0302aff746" containerID="41182b53a83fc6fff77ce10dfadc68a842da06daa24856305548007cf32e1ed6" exitCode=0 Mar 17 04:06:04 crc kubenswrapper[4735]: I0317 04:06:04.528225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562006-79rjz" event={"ID":"153caacf-e2c5-4bcc-9df8-cf0302aff746","Type":"ContainerDied","Data":"41182b53a83fc6fff77ce10dfadc68a842da06daa24856305548007cf32e1ed6"} Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.178482 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.327457 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkxqq\" (UniqueName: \"kubernetes.io/projected/153caacf-e2c5-4bcc-9df8-cf0302aff746-kube-api-access-hkxqq\") pod \"153caacf-e2c5-4bcc-9df8-cf0302aff746\" (UID: \"153caacf-e2c5-4bcc-9df8-cf0302aff746\") " Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.353699 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153caacf-e2c5-4bcc-9df8-cf0302aff746-kube-api-access-hkxqq" (OuterVolumeSpecName: "kube-api-access-hkxqq") pod "153caacf-e2c5-4bcc-9df8-cf0302aff746" (UID: "153caacf-e2c5-4bcc-9df8-cf0302aff746"). InnerVolumeSpecName "kube-api-access-hkxqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.429484 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkxqq\" (UniqueName: \"kubernetes.io/projected/153caacf-e2c5-4bcc-9df8-cf0302aff746-kube-api-access-hkxqq\") on node \"crc\" DevicePath \"\"" Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.551443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562006-79rjz" event={"ID":"153caacf-e2c5-4bcc-9df8-cf0302aff746","Type":"ContainerDied","Data":"5943833dd22f1cc7c619a9f2226d4c64efb6c684a1576f1485d3e1bad3c2def1"} Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.551478 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5943833dd22f1cc7c619a9f2226d4c64efb6c684a1576f1485d3e1bad3c2def1" Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.551522 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562006-79rjz" Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.629781 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562000-rnl52"] Mar 17 04:06:06 crc kubenswrapper[4735]: I0317 04:06:06.636571 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562000-rnl52"] Mar 17 04:06:07 crc kubenswrapper[4735]: I0317 04:06:07.087343 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafdc698-3758-49cd-bc73-1d4e51eefc46" path="/var/lib/kubelet/pods/dafdc698-3758-49cd-bc73-1d4e51eefc46/volumes" Mar 17 04:06:11 crc kubenswrapper[4735]: I0317 04:06:11.169583 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:06:11 crc kubenswrapper[4735]: I0317 04:06:11.267643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:06:11 crc kubenswrapper[4735]: I0317 04:06:11.418019 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5g6wn"] Mar 17 04:06:12 crc kubenswrapper[4735]: I0317 04:06:12.610918 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5g6wn" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="registry-server" containerID="cri-o://9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca" gracePeriod=2 Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.216024 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.276541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwzlz\" (UniqueName: \"kubernetes.io/projected/90f7ae8e-e54c-480b-8f55-080d89b1da90-kube-api-access-mwzlz\") pod \"90f7ae8e-e54c-480b-8f55-080d89b1da90\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.276583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-catalog-content\") pod \"90f7ae8e-e54c-480b-8f55-080d89b1da90\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.276739 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-utilities\") pod \"90f7ae8e-e54c-480b-8f55-080d89b1da90\" (UID: \"90f7ae8e-e54c-480b-8f55-080d89b1da90\") " Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.277485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-utilities" (OuterVolumeSpecName: "utilities") pod "90f7ae8e-e54c-480b-8f55-080d89b1da90" (UID: "90f7ae8e-e54c-480b-8f55-080d89b1da90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.290391 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f7ae8e-e54c-480b-8f55-080d89b1da90-kube-api-access-mwzlz" (OuterVolumeSpecName: "kube-api-access-mwzlz") pod "90f7ae8e-e54c-480b-8f55-080d89b1da90" (UID: "90f7ae8e-e54c-480b-8f55-080d89b1da90"). InnerVolumeSpecName "kube-api-access-mwzlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.343880 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90f7ae8e-e54c-480b-8f55-080d89b1da90" (UID: "90f7ae8e-e54c-480b-8f55-080d89b1da90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.378584 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.378625 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90f7ae8e-e54c-480b-8f55-080d89b1da90-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.378638 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwzlz\" (UniqueName: \"kubernetes.io/projected/90f7ae8e-e54c-480b-8f55-080d89b1da90-kube-api-access-mwzlz\") on node \"crc\" DevicePath \"\"" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.623418 4735 generic.go:334] "Generic (PLEG): container finished" podID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerID="9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca" exitCode=0 Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.623457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g6wn" event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerDied","Data":"9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca"} Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.623487 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-5g6wn" event={"ID":"90f7ae8e-e54c-480b-8f55-080d89b1da90","Type":"ContainerDied","Data":"acffd28605a611dd37678dc783bfdc4bcf84d1235a3c509055d93deca44b9daf"} Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.623517 4735 scope.go:117] "RemoveContainer" containerID="9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.623536 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g6wn" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.650020 4735 scope.go:117] "RemoveContainer" containerID="275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.678035 4735 scope.go:117] "RemoveContainer" containerID="7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.680956 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5g6wn"] Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.693736 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5g6wn"] Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.742095 4735 scope.go:117] "RemoveContainer" containerID="9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca" Mar 17 04:06:13 crc kubenswrapper[4735]: E0317 04:06:13.742938 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca\": container with ID starting with 9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca not found: ID does not exist" containerID="9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 
04:06:13.742980 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca"} err="failed to get container status \"9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca\": rpc error: code = NotFound desc = could not find container \"9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca\": container with ID starting with 9ba264311fbeeda6ad7f9c68bc406ccd22afa670c18ff7a193e9af099903d0ca not found: ID does not exist" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.743005 4735 scope.go:117] "RemoveContainer" containerID="275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010" Mar 17 04:06:13 crc kubenswrapper[4735]: E0317 04:06:13.743365 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010\": container with ID starting with 275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010 not found: ID does not exist" containerID="275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.743407 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010"} err="failed to get container status \"275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010\": rpc error: code = NotFound desc = could not find container \"275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010\": container with ID starting with 275c52b5256a5e1d22f96a51d11062efe130294a98a4b5ace7c213a4d4890010 not found: ID does not exist" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.743453 4735 scope.go:117] "RemoveContainer" containerID="7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a" Mar 17 04:06:13 crc 
kubenswrapper[4735]: E0317 04:06:13.744125 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a\": container with ID starting with 7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a not found: ID does not exist" containerID="7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a" Mar 17 04:06:13 crc kubenswrapper[4735]: I0317 04:06:13.744164 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a"} err="failed to get container status \"7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a\": rpc error: code = NotFound desc = could not find container \"7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a\": container with ID starting with 7cf23b1710421d1db211bf2b39e4abce3e51fe046e978756c5563c69f18e351a not found: ID does not exist" Mar 17 04:06:15 crc kubenswrapper[4735]: I0317 04:06:15.089660 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" path="/var/lib/kubelet/pods/90f7ae8e-e54c-480b-8f55-080d89b1da90/volumes" Mar 17 04:07:10 crc kubenswrapper[4735]: I0317 04:07:10.529574 4735 scope.go:117] "RemoveContainer" containerID="0e4e887af998f927634dd02f414bbc01dfe30de53ced95d31ac6fff5ecc57f9d" Mar 17 04:07:12 crc kubenswrapper[4735]: I0317 04:07:12.606264 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:07:12 crc kubenswrapper[4735]: I0317 04:07:12.606791 4735 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:07:42 crc kubenswrapper[4735]: I0317 04:07:42.612252 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:07:42 crc kubenswrapper[4735]: I0317 04:07:42.613072 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.904843 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mlvcw"] Mar 17 04:07:48 crc kubenswrapper[4735]: E0317 04:07:48.905704 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="153caacf-e2c5-4bcc-9df8-cf0302aff746" containerName="oc" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.905728 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="153caacf-e2c5-4bcc-9df8-cf0302aff746" containerName="oc" Mar 17 04:07:48 crc kubenswrapper[4735]: E0317 04:07:48.905744 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="extract-utilities" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.905750 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="extract-utilities" Mar 17 04:07:48 crc 
kubenswrapper[4735]: E0317 04:07:48.905762 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="extract-content" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.905768 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="extract-content" Mar 17 04:07:48 crc kubenswrapper[4735]: E0317 04:07:48.905779 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="registry-server" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.905785 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="registry-server" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.905974 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="153caacf-e2c5-4bcc-9df8-cf0302aff746" containerName="oc" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.905990 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f7ae8e-e54c-480b-8f55-080d89b1da90" containerName="registry-server" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.907235 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:48 crc kubenswrapper[4735]: I0317 04:07:48.942898 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlvcw"] Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.028083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-catalog-content\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.028562 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wd5\" (UniqueName: \"kubernetes.io/projected/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-kube-api-access-m6wd5\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.028785 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-utilities\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.131267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wd5\" (UniqueName: \"kubernetes.io/projected/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-kube-api-access-m6wd5\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.131325 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-utilities\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.131358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-catalog-content\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.131751 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-catalog-content\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.131914 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-utilities\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.151447 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wd5\" (UniqueName: \"kubernetes.io/projected/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-kube-api-access-m6wd5\") pod \"redhat-marketplace-mlvcw\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.224130 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:49 crc kubenswrapper[4735]: I0317 04:07:49.714161 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlvcw"] Mar 17 04:07:50 crc kubenswrapper[4735]: I0317 04:07:50.705770 4735 generic.go:334] "Generic (PLEG): container finished" podID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerID="8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc" exitCode=0 Mar 17 04:07:50 crc kubenswrapper[4735]: I0317 04:07:50.705978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerDied","Data":"8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc"} Mar 17 04:07:50 crc kubenswrapper[4735]: I0317 04:07:50.706064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerStarted","Data":"eb52ee64ad806c94bbe8402f57c3700bd99b60a9c5dba79ff3062baee231f3fc"} Mar 17 04:07:50 crc kubenswrapper[4735]: I0317 04:07:50.708643 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:07:52 crc kubenswrapper[4735]: I0317 04:07:52.725551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerStarted","Data":"859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6"} Mar 17 04:07:53 crc kubenswrapper[4735]: I0317 04:07:53.737394 4735 generic.go:334] "Generic (PLEG): container finished" podID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerID="859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6" exitCode=0 Mar 17 04:07:53 crc kubenswrapper[4735]: I0317 04:07:53.737511 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerDied","Data":"859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6"} Mar 17 04:07:54 crc kubenswrapper[4735]: I0317 04:07:54.750156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerStarted","Data":"36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe"} Mar 17 04:07:59 crc kubenswrapper[4735]: I0317 04:07:59.225026 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:07:59 crc kubenswrapper[4735]: I0317 04:07:59.225452 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.151215 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mlvcw" podStartSLOduration=8.737740811 podStartE2EDuration="12.151196274s" podCreationTimestamp="2026-03-17 04:07:48 +0000 UTC" firstStartedPulling="2026-03-17 04:07:50.707488157 +0000 UTC m=+10696.339721135" lastFinishedPulling="2026-03-17 04:07:54.12094358 +0000 UTC m=+10699.753176598" observedRunningTime="2026-03-17 04:07:54.776837547 +0000 UTC m=+10700.409070535" watchObservedRunningTime="2026-03-17 04:08:00.151196274 +0000 UTC m=+10705.783429262" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.161640 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562008-c4hv2"] Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.165089 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.169114 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.169253 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.169131 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.179199 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562008-c4hv2"] Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.285203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdrw\" (UniqueName: \"kubernetes.io/projected/f42baa5c-2877-42f2-b046-758a9c21f76a-kube-api-access-2mdrw\") pod \"auto-csr-approver-29562008-c4hv2\" (UID: \"f42baa5c-2877-42f2-b046-758a9c21f76a\") " pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.294275 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mlvcw" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="registry-server" probeResult="failure" output=< Mar 17 04:08:00 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:08:00 crc kubenswrapper[4735]: > Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.387203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdrw\" (UniqueName: \"kubernetes.io/projected/f42baa5c-2877-42f2-b046-758a9c21f76a-kube-api-access-2mdrw\") pod \"auto-csr-approver-29562008-c4hv2\" (UID: \"f42baa5c-2877-42f2-b046-758a9c21f76a\") " 
pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.414021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdrw\" (UniqueName: \"kubernetes.io/projected/f42baa5c-2877-42f2-b046-758a9c21f76a-kube-api-access-2mdrw\") pod \"auto-csr-approver-29562008-c4hv2\" (UID: \"f42baa5c-2877-42f2-b046-758a9c21f76a\") " pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.488626 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:00 crc kubenswrapper[4735]: I0317 04:08:00.901670 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562008-c4hv2"] Mar 17 04:08:00 crc kubenswrapper[4735]: W0317 04:08:00.905172 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf42baa5c_2877_42f2_b046_758a9c21f76a.slice/crio-7a9dd63ec16485a19e5930941c0f12a02e5c55dec17553d2b70d5e316822c0e0 WatchSource:0}: Error finding container 7a9dd63ec16485a19e5930941c0f12a02e5c55dec17553d2b70d5e316822c0e0: Status 404 returned error can't find the container with id 7a9dd63ec16485a19e5930941c0f12a02e5c55dec17553d2b70d5e316822c0e0 Mar 17 04:08:01 crc kubenswrapper[4735]: I0317 04:08:01.820144 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" event={"ID":"f42baa5c-2877-42f2-b046-758a9c21f76a","Type":"ContainerStarted","Data":"7a9dd63ec16485a19e5930941c0f12a02e5c55dec17553d2b70d5e316822c0e0"} Mar 17 04:08:02 crc kubenswrapper[4735]: I0317 04:08:02.829204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" 
event={"ID":"f42baa5c-2877-42f2-b046-758a9c21f76a","Type":"ContainerStarted","Data":"16443899c7d33f849ce10a8e20f0ea7524e51aa5a022369a2332f70e735ab31a"} Mar 17 04:08:02 crc kubenswrapper[4735]: I0317 04:08:02.847247 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" podStartSLOduration=1.788024574 podStartE2EDuration="2.847231088s" podCreationTimestamp="2026-03-17 04:08:00 +0000 UTC" firstStartedPulling="2026-03-17 04:08:00.908669065 +0000 UTC m=+10706.540902043" lastFinishedPulling="2026-03-17 04:08:01.967875569 +0000 UTC m=+10707.600108557" observedRunningTime="2026-03-17 04:08:02.843684341 +0000 UTC m=+10708.475917319" watchObservedRunningTime="2026-03-17 04:08:02.847231088 +0000 UTC m=+10708.479464066" Mar 17 04:08:03 crc kubenswrapper[4735]: I0317 04:08:03.841160 4735 generic.go:334] "Generic (PLEG): container finished" podID="f42baa5c-2877-42f2-b046-758a9c21f76a" containerID="16443899c7d33f849ce10a8e20f0ea7524e51aa5a022369a2332f70e735ab31a" exitCode=0 Mar 17 04:08:03 crc kubenswrapper[4735]: I0317 04:08:03.841232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" event={"ID":"f42baa5c-2877-42f2-b046-758a9c21f76a","Type":"ContainerDied","Data":"16443899c7d33f849ce10a8e20f0ea7524e51aa5a022369a2332f70e735ab31a"} Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.329214 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.508378 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mdrw\" (UniqueName: \"kubernetes.io/projected/f42baa5c-2877-42f2-b046-758a9c21f76a-kube-api-access-2mdrw\") pod \"f42baa5c-2877-42f2-b046-758a9c21f76a\" (UID: \"f42baa5c-2877-42f2-b046-758a9c21f76a\") " Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.520480 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42baa5c-2877-42f2-b046-758a9c21f76a-kube-api-access-2mdrw" (OuterVolumeSpecName: "kube-api-access-2mdrw") pod "f42baa5c-2877-42f2-b046-758a9c21f76a" (UID: "f42baa5c-2877-42f2-b046-758a9c21f76a"). InnerVolumeSpecName "kube-api-access-2mdrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.610218 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mdrw\" (UniqueName: \"kubernetes.io/projected/f42baa5c-2877-42f2-b046-758a9c21f76a-kube-api-access-2mdrw\") on node \"crc\" DevicePath \"\"" Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.879508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" event={"ID":"f42baa5c-2877-42f2-b046-758a9c21f76a","Type":"ContainerDied","Data":"7a9dd63ec16485a19e5930941c0f12a02e5c55dec17553d2b70d5e316822c0e0"} Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.879719 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a9dd63ec16485a19e5930941c0f12a02e5c55dec17553d2b70d5e316822c0e0" Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.879607 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562008-c4hv2" Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.944467 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562002-9m5d4"] Mar 17 04:08:05 crc kubenswrapper[4735]: I0317 04:08:05.953595 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562002-9m5d4"] Mar 17 04:08:07 crc kubenswrapper[4735]: I0317 04:08:07.083638 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e521b46-98e8-4ff9-83be-a3deea393417" path="/var/lib/kubelet/pods/6e521b46-98e8-4ff9-83be-a3deea393417/volumes" Mar 17 04:08:09 crc kubenswrapper[4735]: I0317 04:08:09.313283 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:08:09 crc kubenswrapper[4735]: I0317 04:08:09.387570 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:08:09 crc kubenswrapper[4735]: I0317 04:08:09.558526 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlvcw"] Mar 17 04:08:10 crc kubenswrapper[4735]: I0317 04:08:10.675993 4735 scope.go:117] "RemoveContainer" containerID="4ba1be995ffe175eab2dbe3b517e7f73210afb3cc7889c2e97b983ad058e199e" Mar 17 04:08:10 crc kubenswrapper[4735]: I0317 04:08:10.939377 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mlvcw" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="registry-server" containerID="cri-o://36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe" gracePeriod=2 Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.487308 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.583577 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6wd5\" (UniqueName: \"kubernetes.io/projected/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-kube-api-access-m6wd5\") pod \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.583659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-catalog-content\") pod \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.583719 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-utilities\") pod \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\" (UID: \"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc\") " Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.585637 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-utilities" (OuterVolumeSpecName: "utilities") pod "83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" (UID: "83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.593444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-kube-api-access-m6wd5" (OuterVolumeSpecName: "kube-api-access-m6wd5") pod "83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" (UID: "83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc"). InnerVolumeSpecName "kube-api-access-m6wd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.625287 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" (UID: "83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.686543 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6wd5\" (UniqueName: \"kubernetes.io/projected/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-kube-api-access-m6wd5\") on node \"crc\" DevicePath \"\"" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.686580 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.686592 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.958949 4735 generic.go:334] "Generic (PLEG): container finished" podID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerID="36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe" exitCode=0 Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.959019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerDied","Data":"36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe"} Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.959062 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mlvcw" event={"ID":"83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc","Type":"ContainerDied","Data":"eb52ee64ad806c94bbe8402f57c3700bd99b60a9c5dba79ff3062baee231f3fc"} Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.959077 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlvcw" Mar 17 04:08:11 crc kubenswrapper[4735]: I0317 04:08:11.959092 4735 scope.go:117] "RemoveContainer" containerID="36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.006244 4735 scope.go:117] "RemoveContainer" containerID="859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.044443 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlvcw"] Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.054984 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlvcw"] Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.063960 4735 scope.go:117] "RemoveContainer" containerID="8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.114075 4735 scope.go:117] "RemoveContainer" containerID="36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe" Mar 17 04:08:12 crc kubenswrapper[4735]: E0317 04:08:12.114597 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe\": container with ID starting with 36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe not found: ID does not exist" containerID="36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.114627 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe"} err="failed to get container status \"36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe\": rpc error: code = NotFound desc = could not find container \"36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe\": container with ID starting with 36e269164663faafccf6248f18776000b550f93fddecdcef03c704b05783c0fe not found: ID does not exist" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.114647 4735 scope.go:117] "RemoveContainer" containerID="859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6" Mar 17 04:08:12 crc kubenswrapper[4735]: E0317 04:08:12.115392 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6\": container with ID starting with 859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6 not found: ID does not exist" containerID="859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.115410 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6"} err="failed to get container status \"859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6\": rpc error: code = NotFound desc = could not find container \"859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6\": container with ID starting with 859b77f4f5edfbabe579a5730bfd2493ead6c3aae17cce511a0c93f0376296f6 not found: ID does not exist" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.115529 4735 scope.go:117] "RemoveContainer" containerID="8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc" Mar 17 04:08:12 crc kubenswrapper[4735]: E0317 
04:08:12.115941 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc\": container with ID starting with 8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc not found: ID does not exist" containerID="8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.116131 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc"} err="failed to get container status \"8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc\": rpc error: code = NotFound desc = could not find container \"8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc\": container with ID starting with 8d2d49797f2d9a0817926a3de7c464af5b13ceb9307d433f585419ff0e2ac6bc not found: ID does not exist" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.606149 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.606213 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.606250 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 04:08:12 crc 
kubenswrapper[4735]: I0317 04:08:12.606821 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.606916 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" gracePeriod=600 Mar 17 04:08:12 crc kubenswrapper[4735]: E0317 04:08:12.731072 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.971829 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" exitCode=0 Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.971911 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0"} Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.972120 4735 scope.go:117] "RemoveContainer" 
containerID="85e11aa11f60619705958d96d59d48081f51e908ddadda871bda8e3609aa4481" Mar 17 04:08:12 crc kubenswrapper[4735]: I0317 04:08:12.973052 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:08:12 crc kubenswrapper[4735]: E0317 04:08:12.973607 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:08:13 crc kubenswrapper[4735]: I0317 04:08:13.115579 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" path="/var/lib/kubelet/pods/83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc/volumes" Mar 17 04:08:24 crc kubenswrapper[4735]: I0317 04:08:24.073648 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:08:24 crc kubenswrapper[4735]: E0317 04:08:24.074513 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:08:36 crc kubenswrapper[4735]: I0317 04:08:36.074355 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:08:36 crc kubenswrapper[4735]: E0317 04:08:36.074977 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:08:50 crc kubenswrapper[4735]: I0317 04:08:50.072660 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:08:50 crc kubenswrapper[4735]: E0317 04:08:50.073308 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:09:04 crc kubenswrapper[4735]: I0317 04:09:04.073311 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:09:04 crc kubenswrapper[4735]: E0317 04:09:04.074394 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:09:15 crc kubenswrapper[4735]: I0317 04:09:15.088352 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:09:15 crc kubenswrapper[4735]: E0317 04:09:15.089439 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:09:28 crc kubenswrapper[4735]: I0317 04:09:28.073750 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:09:28 crc kubenswrapper[4735]: E0317 04:09:28.074922 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:09:40 crc kubenswrapper[4735]: I0317 04:09:40.085638 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:09:40 crc kubenswrapper[4735]: E0317 04:09:40.086733 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:09:52 crc kubenswrapper[4735]: I0317 04:09:52.072896 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:09:52 crc kubenswrapper[4735]: E0317 04:09:52.073587 4735 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.172839 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562010-wl5k5"] Mar 17 04:10:00 crc kubenswrapper[4735]: E0317 04:10:00.173778 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="registry-server" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.173793 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="registry-server" Mar 17 04:10:00 crc kubenswrapper[4735]: E0317 04:10:00.173818 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42baa5c-2877-42f2-b046-758a9c21f76a" containerName="oc" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.173826 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42baa5c-2877-42f2-b046-758a9c21f76a" containerName="oc" Mar 17 04:10:00 crc kubenswrapper[4735]: E0317 04:10:00.173843 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="extract-content" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.173851 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="extract-content" Mar 17 04:10:00 crc kubenswrapper[4735]: E0317 04:10:00.173901 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="extract-utilities" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 
04:10:00.173909 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="extract-utilities" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.174122 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e39e4e-eeaf-4b12-94f3-b8b0778b5cfc" containerName="registry-server" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.174149 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42baa5c-2877-42f2-b046-758a9c21f76a" containerName="oc" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.175625 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.178437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgzw\" (UniqueName: \"kubernetes.io/projected/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f-kube-api-access-9xgzw\") pod \"auto-csr-approver-29562010-wl5k5\" (UID: \"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f\") " pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.179176 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.179979 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.181491 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.232934 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562010-wl5k5"] Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.281474 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9xgzw\" (UniqueName: \"kubernetes.io/projected/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f-kube-api-access-9xgzw\") pod \"auto-csr-approver-29562010-wl5k5\" (UID: \"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f\") " pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.300538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgzw\" (UniqueName: \"kubernetes.io/projected/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f-kube-api-access-9xgzw\") pod \"auto-csr-approver-29562010-wl5k5\" (UID: \"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f\") " pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:00 crc kubenswrapper[4735]: I0317 04:10:00.502697 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:01 crc kubenswrapper[4735]: I0317 04:10:01.111216 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562010-wl5k5"] Mar 17 04:10:02 crc kubenswrapper[4735]: I0317 04:10:02.089010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" event={"ID":"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f","Type":"ContainerStarted","Data":"80523a4f3507d708ed1ac38ab714447f1a4d18341f5de601763424ea735f61c6"} Mar 17 04:10:03 crc kubenswrapper[4735]: I0317 04:10:03.073988 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:10:03 crc kubenswrapper[4735]: E0317 04:10:03.075124 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:10:03 crc kubenswrapper[4735]: I0317 04:10:03.103468 4735 generic.go:334] "Generic (PLEG): container finished" podID="30efd3bc-9591-4870-9fa1-40ca3c8fcd5f" containerID="de60566819d9a2f61de25ebfbd6ac3027af1f879ec578609e999fdb1afe6abbc" exitCode=0 Mar 17 04:10:03 crc kubenswrapper[4735]: I0317 04:10:03.103678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" event={"ID":"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f","Type":"ContainerDied","Data":"de60566819d9a2f61de25ebfbd6ac3027af1f879ec578609e999fdb1afe6abbc"} Mar 17 04:10:04 crc kubenswrapper[4735]: I0317 04:10:04.551684 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:04 crc kubenswrapper[4735]: I0317 04:10:04.684825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xgzw\" (UniqueName: \"kubernetes.io/projected/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f-kube-api-access-9xgzw\") pod \"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f\" (UID: \"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f\") " Mar 17 04:10:04 crc kubenswrapper[4735]: I0317 04:10:04.691218 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f-kube-api-access-9xgzw" (OuterVolumeSpecName: "kube-api-access-9xgzw") pod "30efd3bc-9591-4870-9fa1-40ca3c8fcd5f" (UID: "30efd3bc-9591-4870-9fa1-40ca3c8fcd5f"). InnerVolumeSpecName "kube-api-access-9xgzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:10:04 crc kubenswrapper[4735]: I0317 04:10:04.787254 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xgzw\" (UniqueName: \"kubernetes.io/projected/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f-kube-api-access-9xgzw\") on node \"crc\" DevicePath \"\"" Mar 17 04:10:05 crc kubenswrapper[4735]: E0317 04:10:05.109387 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30efd3bc_9591_4870_9fa1_40ca3c8fcd5f.slice\": RecentStats: unable to find data in memory cache]" Mar 17 04:10:05 crc kubenswrapper[4735]: I0317 04:10:05.130435 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" event={"ID":"30efd3bc-9591-4870-9fa1-40ca3c8fcd5f","Type":"ContainerDied","Data":"80523a4f3507d708ed1ac38ab714447f1a4d18341f5de601763424ea735f61c6"} Mar 17 04:10:05 crc kubenswrapper[4735]: I0317 04:10:05.130688 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80523a4f3507d708ed1ac38ab714447f1a4d18341f5de601763424ea735f61c6" Mar 17 04:10:05 crc kubenswrapper[4735]: I0317 04:10:05.130560 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562010-wl5k5" Mar 17 04:10:05 crc kubenswrapper[4735]: I0317 04:10:05.642316 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562004-zpjn7"] Mar 17 04:10:05 crc kubenswrapper[4735]: I0317 04:10:05.652882 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562004-zpjn7"] Mar 17 04:10:07 crc kubenswrapper[4735]: I0317 04:10:07.093818 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e96988-fe37-4e97-b955-6ddb502821dd" path="/var/lib/kubelet/pods/06e96988-fe37-4e97-b955-6ddb502821dd/volumes" Mar 17 04:10:10 crc kubenswrapper[4735]: I0317 04:10:10.853854 4735 scope.go:117] "RemoveContainer" containerID="ea82863211494c1f095ea93911117c23089bac5a0105c6ceac894df7d42bfe95" Mar 17 04:10:15 crc kubenswrapper[4735]: I0317 04:10:15.075106 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:10:15 crc kubenswrapper[4735]: E0317 04:10:15.076304 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:10:28 crc kubenswrapper[4735]: I0317 04:10:28.073122 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:10:28 crc kubenswrapper[4735]: E0317 04:10:28.073989 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:10:43 crc kubenswrapper[4735]: I0317 04:10:43.074136 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:10:43 crc kubenswrapper[4735]: E0317 04:10:43.075611 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:10:54 crc kubenswrapper[4735]: I0317 04:10:54.073882 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:10:54 crc kubenswrapper[4735]: E0317 04:10:54.074664 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:11:06 crc kubenswrapper[4735]: I0317 04:11:06.073382 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:11:06 crc kubenswrapper[4735]: E0317 04:11:06.075827 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:11:20 crc kubenswrapper[4735]: I0317 04:11:20.073351 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:11:20 crc kubenswrapper[4735]: E0317 04:11:20.074905 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:11:31 crc kubenswrapper[4735]: I0317 04:11:31.073088 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:11:31 crc kubenswrapper[4735]: E0317 04:11:31.073997 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:11:43 crc kubenswrapper[4735]: I0317 04:11:43.073985 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:11:43 crc kubenswrapper[4735]: E0317 04:11:43.075239 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:11:56 crc kubenswrapper[4735]: I0317 04:11:56.074677 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:11:56 crc kubenswrapper[4735]: E0317 04:11:56.076443 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.172565 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562012-z8ljs"] Mar 17 04:12:00 crc kubenswrapper[4735]: E0317 04:12:00.173535 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30efd3bc-9591-4870-9fa1-40ca3c8fcd5f" containerName="oc" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.173551 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="30efd3bc-9591-4870-9fa1-40ca3c8fcd5f" containerName="oc" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.173790 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="30efd3bc-9591-4870-9fa1-40ca3c8fcd5f" containerName="oc" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.174570 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.177764 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.178159 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.179548 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.197446 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562012-z8ljs"] Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.367048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95k8\" (UniqueName: \"kubernetes.io/projected/0e5e7bfc-e01c-4827-8094-221777002da4-kube-api-access-l95k8\") pod \"auto-csr-approver-29562012-z8ljs\" (UID: \"0e5e7bfc-e01c-4827-8094-221777002da4\") " pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.468742 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95k8\" (UniqueName: \"kubernetes.io/projected/0e5e7bfc-e01c-4827-8094-221777002da4-kube-api-access-l95k8\") pod \"auto-csr-approver-29562012-z8ljs\" (UID: \"0e5e7bfc-e01c-4827-8094-221777002da4\") " pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.487812 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95k8\" (UniqueName: \"kubernetes.io/projected/0e5e7bfc-e01c-4827-8094-221777002da4-kube-api-access-l95k8\") pod \"auto-csr-approver-29562012-z8ljs\" (UID: \"0e5e7bfc-e01c-4827-8094-221777002da4\") " 
pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.513354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:00 crc kubenswrapper[4735]: I0317 04:12:00.988926 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562012-z8ljs"] Mar 17 04:12:01 crc kubenswrapper[4735]: I0317 04:12:01.339932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" event={"ID":"0e5e7bfc-e01c-4827-8094-221777002da4","Type":"ContainerStarted","Data":"140cbf1335598cef2af113731b5508348cf02b0b91aae8aedc97e76ffe0ac3e2"} Mar 17 04:12:03 crc kubenswrapper[4735]: I0317 04:12:03.358000 4735 generic.go:334] "Generic (PLEG): container finished" podID="0e5e7bfc-e01c-4827-8094-221777002da4" containerID="f8d82bba27bd4a14b1f539ee09e9396c601c28be31395049991a254cd547bd62" exitCode=0 Mar 17 04:12:03 crc kubenswrapper[4735]: I0317 04:12:03.358060 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" event={"ID":"0e5e7bfc-e01c-4827-8094-221777002da4","Type":"ContainerDied","Data":"f8d82bba27bd4a14b1f539ee09e9396c601c28be31395049991a254cd547bd62"} Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:04.933482 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:05.077366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l95k8\" (UniqueName: \"kubernetes.io/projected/0e5e7bfc-e01c-4827-8094-221777002da4-kube-api-access-l95k8\") pod \"0e5e7bfc-e01c-4827-8094-221777002da4\" (UID: \"0e5e7bfc-e01c-4827-8094-221777002da4\") " Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:05.085280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5e7bfc-e01c-4827-8094-221777002da4-kube-api-access-l95k8" (OuterVolumeSpecName: "kube-api-access-l95k8") pod "0e5e7bfc-e01c-4827-8094-221777002da4" (UID: "0e5e7bfc-e01c-4827-8094-221777002da4"). InnerVolumeSpecName "kube-api-access-l95k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:05.180360 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l95k8\" (UniqueName: \"kubernetes.io/projected/0e5e7bfc-e01c-4827-8094-221777002da4-kube-api-access-l95k8\") on node \"crc\" DevicePath \"\"" Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:05.392927 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:05.392850 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562012-z8ljs" event={"ID":"0e5e7bfc-e01c-4827-8094-221777002da4","Type":"ContainerDied","Data":"140cbf1335598cef2af113731b5508348cf02b0b91aae8aedc97e76ffe0ac3e2"} Mar 17 04:12:05 crc kubenswrapper[4735]: I0317 04:12:05.393546 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140cbf1335598cef2af113731b5508348cf02b0b91aae8aedc97e76ffe0ac3e2" Mar 17 04:12:06 crc kubenswrapper[4735]: I0317 04:12:06.035901 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562006-79rjz"] Mar 17 04:12:06 crc kubenswrapper[4735]: I0317 04:12:06.048388 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562006-79rjz"] Mar 17 04:12:07 crc kubenswrapper[4735]: I0317 04:12:07.088908 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153caacf-e2c5-4bcc-9df8-cf0302aff746" path="/var/lib/kubelet/pods/153caacf-e2c5-4bcc-9df8-cf0302aff746/volumes" Mar 17 04:12:10 crc kubenswrapper[4735]: I0317 04:12:10.073770 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:12:10 crc kubenswrapper[4735]: E0317 04:12:10.074561 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:12:10 crc kubenswrapper[4735]: I0317 04:12:10.973565 4735 scope.go:117] "RemoveContainer" 
containerID="41182b53a83fc6fff77ce10dfadc68a842da06daa24856305548007cf32e1ed6" Mar 17 04:12:25 crc kubenswrapper[4735]: I0317 04:12:25.081986 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:12:25 crc kubenswrapper[4735]: E0317 04:12:25.083048 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:12:40 crc kubenswrapper[4735]: I0317 04:12:40.073398 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:12:40 crc kubenswrapper[4735]: E0317 04:12:40.074683 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:12:51 crc kubenswrapper[4735]: I0317 04:12:51.073442 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:12:51 crc kubenswrapper[4735]: E0317 04:12:51.074277 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:13:05 crc kubenswrapper[4735]: I0317 04:13:05.088052 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:13:05 crc kubenswrapper[4735]: E0317 04:13:05.089108 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.011629 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8c46j"] Mar 17 04:13:15 crc kubenswrapper[4735]: E0317 04:13:15.012772 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5e7bfc-e01c-4827-8094-221777002da4" containerName="oc" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.012787 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5e7bfc-e01c-4827-8094-221777002da4" containerName="oc" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.013373 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5e7bfc-e01c-4827-8094-221777002da4" containerName="oc" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.024668 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.025342 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8c46j"] Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.141842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zsm\" (UniqueName: \"kubernetes.io/projected/122e90e7-50f6-45c3-a0b3-e81261d07464-kube-api-access-m9zsm\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.142415 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-catalog-content\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.142555 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-utilities\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.244331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9zsm\" (UniqueName: \"kubernetes.io/projected/122e90e7-50f6-45c3-a0b3-e81261d07464-kube-api-access-m9zsm\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.244567 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-catalog-content\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.245028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-utilities\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.245570 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-catalog-content\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.245797 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-utilities\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.274592 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9zsm\" (UniqueName: \"kubernetes.io/projected/122e90e7-50f6-45c3-a0b3-e81261d07464-kube-api-access-m9zsm\") pod \"certified-operators-8c46j\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.352561 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:15 crc kubenswrapper[4735]: I0317 04:13:15.838800 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8c46j"] Mar 17 04:13:17 crc kubenswrapper[4735]: I0317 04:13:17.076320 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:13:17 crc kubenswrapper[4735]: I0317 04:13:17.387432 4735 generic.go:334] "Generic (PLEG): container finished" podID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerID="91df0b3d3c027e484033c695039b54d997d41b5ad5d18c94edb4dc31d674649b" exitCode=0 Mar 17 04:13:17 crc kubenswrapper[4735]: I0317 04:13:17.387536 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerDied","Data":"91df0b3d3c027e484033c695039b54d997d41b5ad5d18c94edb4dc31d674649b"} Mar 17 04:13:17 crc kubenswrapper[4735]: I0317 04:13:17.387691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerStarted","Data":"41d152d465ea470e8ae3e6f7d53062b14ed13f1e55a679a75729ce2b97b0d4ad"} Mar 17 04:13:17 crc kubenswrapper[4735]: I0317 04:13:17.389273 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:13:17 crc kubenswrapper[4735]: I0317 04:13:17.390471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"a9a7238428e6cff794403881a93370e2d2f5e3267498b568e22796fd898484e5"} Mar 17 04:13:19 crc kubenswrapper[4735]: I0317 04:13:19.409808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerStarted","Data":"472206c76d97967ea2cf06b287e4afdbd6881fc7b91fa8f7e4d24c371be53b22"} Mar 17 04:13:21 crc kubenswrapper[4735]: I0317 04:13:21.427387 4735 generic.go:334] "Generic (PLEG): container finished" podID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerID="472206c76d97967ea2cf06b287e4afdbd6881fc7b91fa8f7e4d24c371be53b22" exitCode=0 Mar 17 04:13:21 crc kubenswrapper[4735]: I0317 04:13:21.428274 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerDied","Data":"472206c76d97967ea2cf06b287e4afdbd6881fc7b91fa8f7e4d24c371be53b22"} Mar 17 04:13:22 crc kubenswrapper[4735]: I0317 04:13:22.451342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerStarted","Data":"3b72c86461464a9054c479717ba7354223b0a60085de57084a550b6857d74c67"} Mar 17 04:13:22 crc kubenswrapper[4735]: I0317 04:13:22.475425 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8c46j" podStartSLOduration=4.044409426 podStartE2EDuration="8.475405611s" podCreationTimestamp="2026-03-17 04:13:14 +0000 UTC" firstStartedPulling="2026-03-17 04:13:17.388993821 +0000 UTC m=+11023.021226819" lastFinishedPulling="2026-03-17 04:13:21.819989986 +0000 UTC m=+11027.452223004" observedRunningTime="2026-03-17 04:13:22.473531275 +0000 UTC m=+11028.105764253" watchObservedRunningTime="2026-03-17 04:13:22.475405611 +0000 UTC m=+11028.107638599" Mar 17 04:13:25 crc kubenswrapper[4735]: I0317 04:13:25.353652 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:25 crc kubenswrapper[4735]: I0317 
04:13:25.354288 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:26 crc kubenswrapper[4735]: I0317 04:13:26.410645 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8c46j" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="registry-server" probeResult="failure" output=< Mar 17 04:13:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:13:26 crc kubenswrapper[4735]: > Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.463432 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.562155 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wc2g9"] Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.573583 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.588944 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.666668 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wc2g9"] Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.712636 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhld\" (UniqueName: \"kubernetes.io/projected/3590c4d6-a7dc-4151-a140-f733b10011d9-kube-api-access-sbhld\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.712707 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-catalog-content\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.712739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-utilities\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.814768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbhld\" (UniqueName: \"kubernetes.io/projected/3590c4d6-a7dc-4151-a140-f733b10011d9-kube-api-access-sbhld\") pod \"redhat-operators-wc2g9\" (UID: 
\"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.814852 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-catalog-content\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.814969 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-utilities\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.816011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-catalog-content\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.816078 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-utilities\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.836821 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbhld\" (UniqueName: \"kubernetes.io/projected/3590c4d6-a7dc-4151-a140-f733b10011d9-kube-api-access-sbhld\") pod \"redhat-operators-wc2g9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") " 
pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:35 crc kubenswrapper[4735]: I0317 04:13:35.890297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:36 crc kubenswrapper[4735]: I0317 04:13:36.760324 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wc2g9"] Mar 17 04:13:36 crc kubenswrapper[4735]: W0317 04:13:36.777783 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3590c4d6_a7dc_4151_a140_f733b10011d9.slice/crio-118a688f789f3560620023dcf60da825a7a2387e28fbbec3012d1793e7dcc4f5 WatchSource:0}: Error finding container 118a688f789f3560620023dcf60da825a7a2387e28fbbec3012d1793e7dcc4f5: Status 404 returned error can't find the container with id 118a688f789f3560620023dcf60da825a7a2387e28fbbec3012d1793e7dcc4f5 Mar 17 04:13:37 crc kubenswrapper[4735]: I0317 04:13:37.585469 4735 generic.go:334] "Generic (PLEG): container finished" podID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerID="422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792" exitCode=0 Mar 17 04:13:37 crc kubenswrapper[4735]: I0317 04:13:37.585715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerDied","Data":"422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792"} Mar 17 04:13:37 crc kubenswrapper[4735]: I0317 04:13:37.585980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerStarted","Data":"118a688f789f3560620023dcf60da825a7a2387e28fbbec3012d1793e7dcc4f5"} Mar 17 04:13:37 crc kubenswrapper[4735]: I0317 04:13:37.949633 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-8c46j"] Mar 17 04:13:37 crc kubenswrapper[4735]: I0317 04:13:37.949932 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8c46j" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="registry-server" containerID="cri-o://3b72c86461464a9054c479717ba7354223b0a60085de57084a550b6857d74c67" gracePeriod=2 Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.621387 4735 generic.go:334] "Generic (PLEG): container finished" podID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerID="3b72c86461464a9054c479717ba7354223b0a60085de57084a550b6857d74c67" exitCode=0 Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.621657 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerDied","Data":"3b72c86461464a9054c479717ba7354223b0a60085de57084a550b6857d74c67"} Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.801720 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.990712 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9zsm\" (UniqueName: \"kubernetes.io/projected/122e90e7-50f6-45c3-a0b3-e81261d07464-kube-api-access-m9zsm\") pod \"122e90e7-50f6-45c3-a0b3-e81261d07464\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.990925 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-utilities\") pod \"122e90e7-50f6-45c3-a0b3-e81261d07464\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.991027 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-catalog-content\") pod \"122e90e7-50f6-45c3-a0b3-e81261d07464\" (UID: \"122e90e7-50f6-45c3-a0b3-e81261d07464\") " Mar 17 04:13:38 crc kubenswrapper[4735]: I0317 04:13:38.992093 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-utilities" (OuterVolumeSpecName: "utilities") pod "122e90e7-50f6-45c3-a0b3-e81261d07464" (UID: "122e90e7-50f6-45c3-a0b3-e81261d07464"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.011386 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122e90e7-50f6-45c3-a0b3-e81261d07464-kube-api-access-m9zsm" (OuterVolumeSpecName: "kube-api-access-m9zsm") pod "122e90e7-50f6-45c3-a0b3-e81261d07464" (UID: "122e90e7-50f6-45c3-a0b3-e81261d07464"). InnerVolumeSpecName "kube-api-access-m9zsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.060455 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "122e90e7-50f6-45c3-a0b3-e81261d07464" (UID: "122e90e7-50f6-45c3-a0b3-e81261d07464"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.092878 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.092903 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e90e7-50f6-45c3-a0b3-e81261d07464-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.092914 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9zsm\" (UniqueName: \"kubernetes.io/projected/122e90e7-50f6-45c3-a0b3-e81261d07464-kube-api-access-m9zsm\") on node \"crc\" DevicePath \"\"" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.631016 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerStarted","Data":"57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140"} Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.633284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c46j" event={"ID":"122e90e7-50f6-45c3-a0b3-e81261d07464","Type":"ContainerDied","Data":"41d152d465ea470e8ae3e6f7d53062b14ed13f1e55a679a75729ce2b97b0d4ad"} Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 
04:13:39.633344 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8c46j" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.633355 4735 scope.go:117] "RemoveContainer" containerID="3b72c86461464a9054c479717ba7354223b0a60085de57084a550b6857d74c67" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.652296 4735 scope.go:117] "RemoveContainer" containerID="472206c76d97967ea2cf06b287e4afdbd6881fc7b91fa8f7e4d24c371be53b22" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.689122 4735 scope.go:117] "RemoveContainer" containerID="91df0b3d3c027e484033c695039b54d997d41b5ad5d18c94edb4dc31d674649b" Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.700034 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8c46j"] Mar 17 04:13:39 crc kubenswrapper[4735]: I0317 04:13:39.711584 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8c46j"] Mar 17 04:13:41 crc kubenswrapper[4735]: I0317 04:13:41.085672 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" path="/var/lib/kubelet/pods/122e90e7-50f6-45c3-a0b3-e81261d07464/volumes" Mar 17 04:13:44 crc kubenswrapper[4735]: I0317 04:13:44.684644 4735 generic.go:334] "Generic (PLEG): container finished" podID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerID="57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140" exitCode=0 Mar 17 04:13:44 crc kubenswrapper[4735]: I0317 04:13:44.684731 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerDied","Data":"57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140"} Mar 17 04:13:45 crc kubenswrapper[4735]: I0317 04:13:45.695014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerStarted","Data":"2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767"} Mar 17 04:13:45 crc kubenswrapper[4735]: I0317 04:13:45.718142 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wc2g9" podStartSLOduration=3.190191015 podStartE2EDuration="10.718122791s" podCreationTimestamp="2026-03-17 04:13:35 +0000 UTC" firstStartedPulling="2026-03-17 04:13:37.588264838 +0000 UTC m=+11043.220497816" lastFinishedPulling="2026-03-17 04:13:45.116196614 +0000 UTC m=+11050.748429592" observedRunningTime="2026-03-17 04:13:45.716635725 +0000 UTC m=+11051.348868703" watchObservedRunningTime="2026-03-17 04:13:45.718122791 +0000 UTC m=+11051.350355769" Mar 17 04:13:45 crc kubenswrapper[4735]: I0317 04:13:45.890737 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:45 crc kubenswrapper[4735]: I0317 04:13:45.890791 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wc2g9" Mar 17 04:13:46 crc kubenswrapper[4735]: I0317 04:13:46.938540 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wc2g9" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server" probeResult="failure" output=< Mar 17 04:13:46 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:13:46 crc kubenswrapper[4735]: > Mar 17 04:13:56 crc kubenswrapper[4735]: I0317 04:13:56.939595 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wc2g9" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server" probeResult="failure" output=< Mar 17 04:13:56 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 
1s Mar 17 04:13:56 crc kubenswrapper[4735]: > Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.260084 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562014-vn2lc"] Mar 17 04:14:00 crc kubenswrapper[4735]: E0317 04:14:00.265671 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="extract-content" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.266162 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="extract-content" Mar 17 04:14:00 crc kubenswrapper[4735]: E0317 04:14:00.266229 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="registry-server" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.266237 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="registry-server" Mar 17 04:14:00 crc kubenswrapper[4735]: E0317 04:14:00.266251 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="extract-utilities" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.266258 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="extract-utilities" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.266537 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="122e90e7-50f6-45c3-a0b3-e81261d07464" containerName="registry-server" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.267168 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.268743 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562014-vn2lc"] Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.278737 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.278750 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.284341 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.321788 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtvdm\" (UniqueName: \"kubernetes.io/projected/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49-kube-api-access-rtvdm\") pod \"auto-csr-approver-29562014-vn2lc\" (UID: \"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49\") " pod="openshift-infra/auto-csr-approver-29562014-vn2lc" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.423443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtvdm\" (UniqueName: \"kubernetes.io/projected/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49-kube-api-access-rtvdm\") pod \"auto-csr-approver-29562014-vn2lc\" (UID: \"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49\") " pod="openshift-infra/auto-csr-approver-29562014-vn2lc" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.443482 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtvdm\" (UniqueName: \"kubernetes.io/projected/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49-kube-api-access-rtvdm\") pod \"auto-csr-approver-29562014-vn2lc\" (UID: \"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49\") " 
pod="openshift-infra/auto-csr-approver-29562014-vn2lc" Mar 17 04:14:00 crc kubenswrapper[4735]: I0317 04:14:00.587133 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" Mar 17 04:14:01 crc kubenswrapper[4735]: I0317 04:14:01.126597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562014-vn2lc"] Mar 17 04:14:01 crc kubenswrapper[4735]: I0317 04:14:01.852807 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" event={"ID":"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49","Type":"ContainerStarted","Data":"9447f45d5c4c83f1c86a5dd3ed8efe2ffaf82e3b63425dba5341a8e827cd6fb3"} Mar 17 04:14:03 crc kubenswrapper[4735]: I0317 04:14:03.872302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" event={"ID":"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49","Type":"ContainerStarted","Data":"5cf13667c753028bca250ef6d6d7a5b6781aad7cce08f994766d6dfa58863fe2"} Mar 17 04:14:03 crc kubenswrapper[4735]: I0317 04:14:03.890760 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" podStartSLOduration=2.9566956319999997 podStartE2EDuration="3.889624481s" podCreationTimestamp="2026-03-17 04:14:00 +0000 UTC" firstStartedPulling="2026-03-17 04:14:01.228261429 +0000 UTC m=+11066.860494407" lastFinishedPulling="2026-03-17 04:14:02.161190278 +0000 UTC m=+11067.793423256" observedRunningTime="2026-03-17 04:14:03.886817803 +0000 UTC m=+11069.519050781" watchObservedRunningTime="2026-03-17 04:14:03.889624481 +0000 UTC m=+11069.521857459" Mar 17 04:14:04 crc kubenswrapper[4735]: I0317 04:14:04.881545 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cabd609-66ca-4a0c-a84b-0f0d05aa6d49" containerID="5cf13667c753028bca250ef6d6d7a5b6781aad7cce08f994766d6dfa58863fe2" exitCode=0 Mar 17 04:14:04 crc 
kubenswrapper[4735]: I0317 04:14:04.881584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" event={"ID":"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49","Type":"ContainerDied","Data":"5cf13667c753028bca250ef6d6d7a5b6781aad7cce08f994766d6dfa58863fe2"}
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.557207 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562014-vn2lc"
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.679487 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtvdm\" (UniqueName: \"kubernetes.io/projected/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49-kube-api-access-rtvdm\") pod \"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49\" (UID: \"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49\") "
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.705166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49-kube-api-access-rtvdm" (OuterVolumeSpecName: "kube-api-access-rtvdm") pod "1cabd609-66ca-4a0c-a84b-0f0d05aa6d49" (UID: "1cabd609-66ca-4a0c-a84b-0f0d05aa6d49"). InnerVolumeSpecName "kube-api-access-rtvdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.781544 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtvdm\" (UniqueName: \"kubernetes.io/projected/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49-kube-api-access-rtvdm\") on node \"crc\" DevicePath \"\""
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.897557 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562014-vn2lc" event={"ID":"1cabd609-66ca-4a0c-a84b-0f0d05aa6d49","Type":"ContainerDied","Data":"9447f45d5c4c83f1c86a5dd3ed8efe2ffaf82e3b63425dba5341a8e827cd6fb3"}
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.897607 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562014-vn2lc"
Mar 17 04:14:06 crc kubenswrapper[4735]: I0317 04:14:06.898452 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9447f45d5c4c83f1c86a5dd3ed8efe2ffaf82e3b63425dba5341a8e827cd6fb3"
Mar 17 04:14:07 crc kubenswrapper[4735]: I0317 04:14:07.003282 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562008-c4hv2"]
Mar 17 04:14:07 crc kubenswrapper[4735]: I0317 04:14:07.012729 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562008-c4hv2"]
Mar 17 04:14:07 crc kubenswrapper[4735]: I0317 04:14:07.040273 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wc2g9" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server" probeResult="failure" output=<
Mar 17 04:14:07 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 04:14:07 crc kubenswrapper[4735]: >
Mar 17 04:14:07 crc kubenswrapper[4735]: I0317 04:14:07.083419 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42baa5c-2877-42f2-b046-758a9c21f76a" path="/var/lib/kubelet/pods/f42baa5c-2877-42f2-b046-758a9c21f76a/volumes"
Mar 17 04:14:11 crc kubenswrapper[4735]: I0317 04:14:11.072146 4735 scope.go:117] "RemoveContainer" containerID="16443899c7d33f849ce10a8e20f0ea7524e51aa5a022369a2332f70e735ab31a"
Mar 17 04:14:16 crc kubenswrapper[4735]: I0317 04:14:16.940913 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wc2g9" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server" probeResult="failure" output=<
Mar 17 04:14:16 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 04:14:16 crc kubenswrapper[4735]: >
Mar 17 04:14:26 crc kubenswrapper[4735]: I0317 04:14:26.950055 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wc2g9" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server" probeResult="failure" output=<
Mar 17 04:14:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Mar 17 04:14:26 crc kubenswrapper[4735]: >
Mar 17 04:14:36 crc kubenswrapper[4735]: I0317 04:14:36.296818 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wc2g9"
Mar 17 04:14:36 crc kubenswrapper[4735]: I0317 04:14:36.361338 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wc2g9"
Mar 17 04:14:36 crc kubenswrapper[4735]: I0317 04:14:36.797128 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wc2g9"]
Mar 17 04:14:38 crc kubenswrapper[4735]: I0317 04:14:38.215164 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wc2g9" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server" containerID="cri-o://2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767" gracePeriod=2
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.089761 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wc2g9"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.206668 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbhld\" (UniqueName: \"kubernetes.io/projected/3590c4d6-a7dc-4151-a140-f733b10011d9-kube-api-access-sbhld\") pod \"3590c4d6-a7dc-4151-a140-f733b10011d9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") "
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.207156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-utilities\") pod \"3590c4d6-a7dc-4151-a140-f733b10011d9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") "
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.207334 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-catalog-content\") pod \"3590c4d6-a7dc-4151-a140-f733b10011d9\" (UID: \"3590c4d6-a7dc-4151-a140-f733b10011d9\") "
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.210121 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-utilities" (OuterVolumeSpecName: "utilities") pod "3590c4d6-a7dc-4151-a140-f733b10011d9" (UID: "3590c4d6-a7dc-4151-a140-f733b10011d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.224135 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3590c4d6-a7dc-4151-a140-f733b10011d9-kube-api-access-sbhld" (OuterVolumeSpecName: "kube-api-access-sbhld") pod "3590c4d6-a7dc-4151-a140-f733b10011d9" (UID: "3590c4d6-a7dc-4151-a140-f733b10011d9"). InnerVolumeSpecName "kube-api-access-sbhld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.234606 4735 generic.go:334] "Generic (PLEG): container finished" podID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerID="2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767" exitCode=0
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.234655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerDied","Data":"2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767"}
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.234687 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc2g9" event={"ID":"3590c4d6-a7dc-4151-a140-f733b10011d9","Type":"ContainerDied","Data":"118a688f789f3560620023dcf60da825a7a2387e28fbbec3012d1793e7dcc4f5"}
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.234755 4735 scope.go:117] "RemoveContainer" containerID="2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.235790 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wc2g9"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.295261 4735 scope.go:117] "RemoveContainer" containerID="57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.310303 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbhld\" (UniqueName: \"kubernetes.io/projected/3590c4d6-a7dc-4151-a140-f733b10011d9-kube-api-access-sbhld\") on node \"crc\" DevicePath \"\""
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.310345 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.316332 4735 scope.go:117] "RemoveContainer" containerID="422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.366669 4735 scope.go:117] "RemoveContainer" containerID="2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.371159 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3590c4d6-a7dc-4151-a140-f733b10011d9" (UID: "3590c4d6-a7dc-4151-a140-f733b10011d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 04:14:39 crc kubenswrapper[4735]: E0317 04:14:39.375126 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767\": container with ID starting with 2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767 not found: ID does not exist" containerID="2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.375194 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767"} err="failed to get container status \"2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767\": rpc error: code = NotFound desc = could not find container \"2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767\": container with ID starting with 2658510ce2541b490640830a070b53fa472f217caa20b2aacaeea5847e163767 not found: ID does not exist"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.375236 4735 scope.go:117] "RemoveContainer" containerID="57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140"
Mar 17 04:14:39 crc kubenswrapper[4735]: E0317 04:14:39.385743 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140\": container with ID starting with 57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140 not found: ID does not exist" containerID="57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.385768 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140"} err="failed to get container status \"57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140\": rpc error: code = NotFound desc = could not find container \"57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140\": container with ID starting with 57dd85201edfb024207d906aa5fccd388a3aa1e3e2586416b1dd3d84c5c40140 not found: ID does not exist"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.385797 4735 scope.go:117] "RemoveContainer" containerID="422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792"
Mar 17 04:14:39 crc kubenswrapper[4735]: E0317 04:14:39.386368 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792\": container with ID starting with 422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792 not found: ID does not exist" containerID="422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.386460 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792"} err="failed to get container status \"422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792\": rpc error: code = NotFound desc = could not find container \"422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792\": container with ID starting with 422e2b3a56b361b9c5245fe45ed49352aeece2d2497fd5d1f17dd2417110a792 not found: ID does not exist"
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.412446 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3590c4d6-a7dc-4151-a140-f733b10011d9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.568107 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wc2g9"]
Mar 17 04:14:39 crc kubenswrapper[4735]: I0317 04:14:39.575105 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wc2g9"]
Mar 17 04:14:41 crc kubenswrapper[4735]: I0317 04:14:41.089772 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" path="/var/lib/kubelet/pods/3590c4d6-a7dc-4151-a140-f733b10011d9/volumes"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.198509 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"]
Mar 17 04:15:00 crc kubenswrapper[4735]: E0317 04:15:00.201291 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="extract-content"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.201755 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="extract-content"
Mar 17 04:15:00 crc kubenswrapper[4735]: E0317 04:15:00.201785 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cabd609-66ca-4a0c-a84b-0f0d05aa6d49" containerName="oc"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.201794 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cabd609-66ca-4a0c-a84b-0f0d05aa6d49" containerName="oc"
Mar 17 04:15:00 crc kubenswrapper[4735]: E0317 04:15:00.201815 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.201823 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server"
Mar 17 04:15:00 crc kubenswrapper[4735]: E0317 04:15:00.201847 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="extract-utilities"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.201872 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="extract-utilities"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.202430 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cabd609-66ca-4a0c-a84b-0f0d05aa6d49" containerName="oc"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.202474 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3590c4d6-a7dc-4151-a140-f733b10011d9" containerName="registry-server"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.205652 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.217012 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.217017 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.234639 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"]
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.249840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e559ea72-a821-4965-931b-86b9ee42b426-config-volume\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.250175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e559ea72-a821-4965-931b-86b9ee42b426-secret-volume\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.250304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjzs\" (UniqueName: \"kubernetes.io/projected/e559ea72-a821-4965-931b-86b9ee42b426-kube-api-access-qhjzs\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.352221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e559ea72-a821-4965-931b-86b9ee42b426-secret-volume\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.352296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjzs\" (UniqueName: \"kubernetes.io/projected/e559ea72-a821-4965-931b-86b9ee42b426-kube-api-access-qhjzs\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.352399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e559ea72-a821-4965-931b-86b9ee42b426-config-volume\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.353466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e559ea72-a821-4965-931b-86b9ee42b426-config-volume\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.359530 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e559ea72-a821-4965-931b-86b9ee42b426-secret-volume\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.375549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjzs\" (UniqueName: \"kubernetes.io/projected/e559ea72-a821-4965-931b-86b9ee42b426-kube-api-access-qhjzs\") pod \"collect-profiles-29562015-q8tv5\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:00 crc kubenswrapper[4735]: I0317 04:15:00.536899 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:01 crc kubenswrapper[4735]: I0317 04:15:01.098058 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"]
Mar 17 04:15:01 crc kubenswrapper[4735]: I0317 04:15:01.509472 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5" event={"ID":"e559ea72-a821-4965-931b-86b9ee42b426","Type":"ContainerStarted","Data":"846d469ea0d8fbd20e852eff03162f5571fbee79848504c00c806cf7f1546bb1"}
Mar 17 04:15:01 crc kubenswrapper[4735]: I0317 04:15:01.510227 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5" event={"ID":"e559ea72-a821-4965-931b-86b9ee42b426","Type":"ContainerStarted","Data":"12dcf2fb2313466fe567350c94af26575197ccf8e45ac602e60b6168679d3dc5"}
Mar 17 04:15:01 crc kubenswrapper[4735]: I0317 04:15:01.533040 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5" podStartSLOduration=1.5330246490000001 podStartE2EDuration="1.533024649s" podCreationTimestamp="2026-03-17 04:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:15:01.530979639 +0000 UTC m=+11127.163212627" watchObservedRunningTime="2026-03-17 04:15:01.533024649 +0000 UTC m=+11127.165257627"
Mar 17 04:15:02 crc kubenswrapper[4735]: I0317 04:15:02.517850 4735 generic.go:334] "Generic (PLEG): container finished" podID="e559ea72-a821-4965-931b-86b9ee42b426" containerID="846d469ea0d8fbd20e852eff03162f5571fbee79848504c00c806cf7f1546bb1" exitCode=0
Mar 17 04:15:02 crc kubenswrapper[4735]: I0317 04:15:02.517966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5" event={"ID":"e559ea72-a821-4965-931b-86b9ee42b426","Type":"ContainerDied","Data":"846d469ea0d8fbd20e852eff03162f5571fbee79848504c00c806cf7f1546bb1"}
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.005370 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.119578 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e559ea72-a821-4965-931b-86b9ee42b426-secret-volume\") pod \"e559ea72-a821-4965-931b-86b9ee42b426\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") "
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.119691 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e559ea72-a821-4965-931b-86b9ee42b426-config-volume\") pod \"e559ea72-a821-4965-931b-86b9ee42b426\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") "
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.119997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhjzs\" (UniqueName: \"kubernetes.io/projected/e559ea72-a821-4965-931b-86b9ee42b426-kube-api-access-qhjzs\") pod \"e559ea72-a821-4965-931b-86b9ee42b426\" (UID: \"e559ea72-a821-4965-931b-86b9ee42b426\") "
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.120339 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e559ea72-a821-4965-931b-86b9ee42b426-config-volume" (OuterVolumeSpecName: "config-volume") pod "e559ea72-a821-4965-931b-86b9ee42b426" (UID: "e559ea72-a821-4965-931b-86b9ee42b426"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.120449 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e559ea72-a821-4965-931b-86b9ee42b426-config-volume\") on node \"crc\" DevicePath \"\""
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.128049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e559ea72-a821-4965-931b-86b9ee42b426-kube-api-access-qhjzs" (OuterVolumeSpecName: "kube-api-access-qhjzs") pod "e559ea72-a821-4965-931b-86b9ee42b426" (UID: "e559ea72-a821-4965-931b-86b9ee42b426"). InnerVolumeSpecName "kube-api-access-qhjzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.129970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e559ea72-a821-4965-931b-86b9ee42b426-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e559ea72-a821-4965-931b-86b9ee42b426" (UID: "e559ea72-a821-4965-931b-86b9ee42b426"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.222357 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e559ea72-a821-4965-931b-86b9ee42b426-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.222608 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhjzs\" (UniqueName: \"kubernetes.io/projected/e559ea72-a821-4965-931b-86b9ee42b426-kube-api-access-qhjzs\") on node \"crc\" DevicePath \"\""
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.537154 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5" event={"ID":"e559ea72-a821-4965-931b-86b9ee42b426","Type":"ContainerDied","Data":"12dcf2fb2313466fe567350c94af26575197ccf8e45ac602e60b6168679d3dc5"}
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.537187 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562015-q8tv5"
Mar 17 04:15:04 crc kubenswrapper[4735]: I0317 04:15:04.537207 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12dcf2fb2313466fe567350c94af26575197ccf8e45ac602e60b6168679d3dc5"
Mar 17 04:15:05 crc kubenswrapper[4735]: I0317 04:15:05.094318 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8"]
Mar 17 04:15:05 crc kubenswrapper[4735]: I0317 04:15:05.095253 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561970-fgtm8"]
Mar 17 04:15:07 crc kubenswrapper[4735]: I0317 04:15:07.084749 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c13609-dc71-4248-992c-b90968adf138" path="/var/lib/kubelet/pods/f3c13609-dc71-4248-992c-b90968adf138/volumes"
Mar 17 04:15:11 crc kubenswrapper[4735]: I0317 04:15:11.477799 4735 scope.go:117] "RemoveContainer" containerID="8ed62c91c46c798c6db90079cf7137ad5de0fca61caaadc658cff30ac9f25036"
Mar 17 04:15:42 crc kubenswrapper[4735]: I0317 04:15:42.606092 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 04:15:42 crc kubenswrapper[4735]: I0317 04:15:42.608998 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.156641 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562016-q5k82"]
Mar 17 04:16:00 crc kubenswrapper[4735]: E0317 04:16:00.157629 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e559ea72-a821-4965-931b-86b9ee42b426" containerName="collect-profiles"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.157644 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e559ea72-a821-4965-931b-86b9ee42b426" containerName="collect-profiles"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.157912 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e559ea72-a821-4965-931b-86b9ee42b426" containerName="collect-profiles"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.158610 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.164897 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.164993 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.165150 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.172435 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562016-q5k82"]
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.332965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89pk\" (UniqueName: \"kubernetes.io/projected/57164bf3-2763-422b-94be-2c1590adc27d-kube-api-access-f89pk\") pod \"auto-csr-approver-29562016-q5k82\" (UID: \"57164bf3-2763-422b-94be-2c1590adc27d\") " pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.435407 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f89pk\" (UniqueName: \"kubernetes.io/projected/57164bf3-2763-422b-94be-2c1590adc27d-kube-api-access-f89pk\") pod \"auto-csr-approver-29562016-q5k82\" (UID: \"57164bf3-2763-422b-94be-2c1590adc27d\") " pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.463283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89pk\" (UniqueName: \"kubernetes.io/projected/57164bf3-2763-422b-94be-2c1590adc27d-kube-api-access-f89pk\") pod \"auto-csr-approver-29562016-q5k82\" (UID: \"57164bf3-2763-422b-94be-2c1590adc27d\") " pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:00 crc kubenswrapper[4735]: I0317 04:16:00.506676 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:01 crc kubenswrapper[4735]: I0317 04:16:01.334266 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562016-q5k82"]
Mar 17 04:16:02 crc kubenswrapper[4735]: I0317 04:16:02.127970 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562016-q5k82" event={"ID":"57164bf3-2763-422b-94be-2c1590adc27d","Type":"ContainerStarted","Data":"641acf8c3f36b6d2106aaad602852f1d41e984e9d478f7ed31e2d87a462eff56"}
Mar 17 04:16:04 crc kubenswrapper[4735]: I0317 04:16:04.151734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562016-q5k82" event={"ID":"57164bf3-2763-422b-94be-2c1590adc27d","Type":"ContainerStarted","Data":"4a6dcf4eb9b626857a6dc38d00b6c5a2cedb7639ee0797df4e12f021a07b5626"}
Mar 17 04:16:04 crc kubenswrapper[4735]: I0317 04:16:04.189302 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562016-q5k82" podStartSLOduration=3.196282975 podStartE2EDuration="4.189276052s" podCreationTimestamp="2026-03-17 04:16:00 +0000 UTC" firstStartedPulling="2026-03-17 04:16:01.942549752 +0000 UTC m=+11187.574782730" lastFinishedPulling="2026-03-17 04:16:02.935542789 +0000 UTC m=+11188.567775807" observedRunningTime="2026-03-17 04:16:04.168263162 +0000 UTC m=+11189.800496150" watchObservedRunningTime="2026-03-17 04:16:04.189276052 +0000 UTC m=+11189.821509050"
Mar 17 04:16:06 crc kubenswrapper[4735]: I0317 04:16:06.175296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562016-q5k82" event={"ID":"57164bf3-2763-422b-94be-2c1590adc27d","Type":"ContainerDied","Data":"4a6dcf4eb9b626857a6dc38d00b6c5a2cedb7639ee0797df4e12f021a07b5626"}
Mar 17 04:16:06 crc kubenswrapper[4735]: I0317 04:16:06.175773 4735 generic.go:334] "Generic (PLEG): container finished" podID="57164bf3-2763-422b-94be-2c1590adc27d" containerID="4a6dcf4eb9b626857a6dc38d00b6c5a2cedb7639ee0797df4e12f021a07b5626" exitCode=0
Mar 17 04:16:07 crc kubenswrapper[4735]: I0317 04:16:07.722150 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:07 crc kubenswrapper[4735]: I0317 04:16:07.883052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f89pk\" (UniqueName: \"kubernetes.io/projected/57164bf3-2763-422b-94be-2c1590adc27d-kube-api-access-f89pk\") pod \"57164bf3-2763-422b-94be-2c1590adc27d\" (UID: \"57164bf3-2763-422b-94be-2c1590adc27d\") "
Mar 17 04:16:07 crc kubenswrapper[4735]: I0317 04:16:07.913197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57164bf3-2763-422b-94be-2c1590adc27d-kube-api-access-f89pk" (OuterVolumeSpecName: "kube-api-access-f89pk") pod "57164bf3-2763-422b-94be-2c1590adc27d" (UID: "57164bf3-2763-422b-94be-2c1590adc27d"). InnerVolumeSpecName "kube-api-access-f89pk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:16:07 crc kubenswrapper[4735]: I0317 04:16:07.985170 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f89pk\" (UniqueName: \"kubernetes.io/projected/57164bf3-2763-422b-94be-2c1590adc27d-kube-api-access-f89pk\") on node \"crc\" DevicePath \"\""
Mar 17 04:16:08 crc kubenswrapper[4735]: I0317 04:16:08.198201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562016-q5k82" event={"ID":"57164bf3-2763-422b-94be-2c1590adc27d","Type":"ContainerDied","Data":"641acf8c3f36b6d2106aaad602852f1d41e984e9d478f7ed31e2d87a462eff56"}
Mar 17 04:16:08 crc kubenswrapper[4735]: I0317 04:16:08.198237 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641acf8c3f36b6d2106aaad602852f1d41e984e9d478f7ed31e2d87a462eff56"
Mar 17 04:16:08 crc kubenswrapper[4735]: I0317 04:16:08.198287 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562016-q5k82"
Mar 17 04:16:08 crc kubenswrapper[4735]: I0317 04:16:08.286778 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562010-wl5k5"]
Mar 17 04:16:08 crc kubenswrapper[4735]: I0317 04:16:08.297819 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562010-wl5k5"]
Mar 17 04:16:09 crc kubenswrapper[4735]: I0317 04:16:09.099230 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30efd3bc-9591-4870-9fa1-40ca3c8fcd5f" path="/var/lib/kubelet/pods/30efd3bc-9591-4870-9fa1-40ca3c8fcd5f/volumes"
Mar 17 04:16:11 crc kubenswrapper[4735]: I0317 04:16:11.604045 4735 scope.go:117] "RemoveContainer" containerID="de60566819d9a2f61de25ebfbd6ac3027af1f879ec578609e999fdb1afe6abbc"
Mar 17 04:16:12 crc kubenswrapper[4735]: I0317 04:16:12.606323 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 04:16:12 crc kubenswrapper[4735]: I0317 04:16:12.606726 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 04:16:20 crc kubenswrapper[4735]: I0317 04:16:20.952992 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wd48s"]
Mar 17 04:16:20 crc kubenswrapper[4735]: E0317 04:16:20.953813 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57164bf3-2763-422b-94be-2c1590adc27d" containerName="oc"
Mar 17 04:16:20 crc kubenswrapper[4735]: I0317 04:16:20.953825 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="57164bf3-2763-422b-94be-2c1590adc27d" containerName="oc"
Mar 17 04:16:20 crc kubenswrapper[4735]: I0317 04:16:20.954736 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="57164bf3-2763-422b-94be-2c1590adc27d" containerName="oc"
Mar 17 04:16:20 crc kubenswrapper[4735]: I0317 04:16:20.957603 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wd48s"
Mar 17 04:16:20 crc kubenswrapper[4735]: I0317 04:16:20.987122 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wd48s"]
Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.049605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjfb\" (UniqueName: \"kubernetes.io/projected/2299117f-53cb-4e14-b7a9-e99fa5005045-kube-api-access-ngjfb\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s"
Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.049789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-utilities\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s"
Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.049955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-catalog-content\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s"
Mar 17 04:16:21 crc 
kubenswrapper[4735]: I0317 04:16:21.152219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-catalog-content\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.152329 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjfb\" (UniqueName: \"kubernetes.io/projected/2299117f-53cb-4e14-b7a9-e99fa5005045-kube-api-access-ngjfb\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.152455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-utilities\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.153041 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-catalog-content\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.153219 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-utilities\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.183487 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjfb\" (UniqueName: \"kubernetes.io/projected/2299117f-53cb-4e14-b7a9-e99fa5005045-kube-api-access-ngjfb\") pod \"community-operators-wd48s\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:21 crc kubenswrapper[4735]: I0317 04:16:21.282812 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:22 crc kubenswrapper[4735]: I0317 04:16:22.199613 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wd48s"] Mar 17 04:16:22 crc kubenswrapper[4735]: I0317 04:16:22.366090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerStarted","Data":"c15d1a2c4f0f81bb1bc3f0410dfb5dd963a6d4a8ccd063747a3aa75ca44f7ec2"} Mar 17 04:16:23 crc kubenswrapper[4735]: I0317 04:16:23.380701 4735 generic.go:334] "Generic (PLEG): container finished" podID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerID="f97cca550e04b20cbd0058742ebfbaeca83f1b8aa7aae8dadc23db744da188da" exitCode=0 Mar 17 04:16:23 crc kubenswrapper[4735]: I0317 04:16:23.381049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerDied","Data":"f97cca550e04b20cbd0058742ebfbaeca83f1b8aa7aae8dadc23db744da188da"} Mar 17 04:16:25 crc kubenswrapper[4735]: I0317 04:16:25.423304 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerStarted","Data":"cb080398318c4dbc6ff4313e4e10e6765efc7b211d096cce6df9dff7837cb729"} Mar 17 04:16:26 crc kubenswrapper[4735]: I0317 04:16:26.436296 4735 
generic.go:334] "Generic (PLEG): container finished" podID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerID="cb080398318c4dbc6ff4313e4e10e6765efc7b211d096cce6df9dff7837cb729" exitCode=0 Mar 17 04:16:26 crc kubenswrapper[4735]: I0317 04:16:26.436515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerDied","Data":"cb080398318c4dbc6ff4313e4e10e6765efc7b211d096cce6df9dff7837cb729"} Mar 17 04:16:27 crc kubenswrapper[4735]: I0317 04:16:27.448990 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerStarted","Data":"b0125f776647073c6cb9e5f50cedbe9fb60a3a75111855eb456779fab61912c1"} Mar 17 04:16:27 crc kubenswrapper[4735]: I0317 04:16:27.473976 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wd48s" podStartSLOduration=3.813453922 podStartE2EDuration="7.471673925s" podCreationTimestamp="2026-03-17 04:16:20 +0000 UTC" firstStartedPulling="2026-03-17 04:16:23.383713134 +0000 UTC m=+11209.015946122" lastFinishedPulling="2026-03-17 04:16:27.041933147 +0000 UTC m=+11212.674166125" observedRunningTime="2026-03-17 04:16:27.466350296 +0000 UTC m=+11213.098583274" watchObservedRunningTime="2026-03-17 04:16:27.471673925 +0000 UTC m=+11213.103906903" Mar 17 04:16:31 crc kubenswrapper[4735]: I0317 04:16:31.283669 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:31 crc kubenswrapper[4735]: I0317 04:16:31.284315 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:32 crc kubenswrapper[4735]: I0317 04:16:32.380751 4735 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-wd48s" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="registry-server" probeResult="failure" output=< Mar 17 04:16:32 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:16:32 crc kubenswrapper[4735]: > Mar 17 04:16:41 crc kubenswrapper[4735]: I0317 04:16:41.340755 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:41 crc kubenswrapper[4735]: I0317 04:16:41.392811 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:41 crc kubenswrapper[4735]: I0317 04:16:41.641537 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wd48s"] Mar 17 04:16:42 crc kubenswrapper[4735]: I0317 04:16:42.606685 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:16:42 crc kubenswrapper[4735]: I0317 04:16:42.610944 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:16:42 crc kubenswrapper[4735]: I0317 04:16:42.611046 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 04:16:42 crc kubenswrapper[4735]: I0317 04:16:42.614626 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"a9a7238428e6cff794403881a93370e2d2f5e3267498b568e22796fd898484e5"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 04:16:42 crc kubenswrapper[4735]: I0317 04:16:42.615262 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://a9a7238428e6cff794403881a93370e2d2f5e3267498b568e22796fd898484e5" gracePeriod=600 Mar 17 04:16:42 crc kubenswrapper[4735]: I0317 04:16:42.650972 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wd48s" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="registry-server" containerID="cri-o://b0125f776647073c6cb9e5f50cedbe9fb60a3a75111855eb456779fab61912c1" gracePeriod=2 Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.662227 4735 generic.go:334] "Generic (PLEG): container finished" podID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerID="b0125f776647073c6cb9e5f50cedbe9fb60a3a75111855eb456779fab61912c1" exitCode=0 Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.662285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerDied","Data":"b0125f776647073c6cb9e5f50cedbe9fb60a3a75111855eb456779fab61912c1"} Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.676220 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="a9a7238428e6cff794403881a93370e2d2f5e3267498b568e22796fd898484e5" exitCode=0 Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.676265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"a9a7238428e6cff794403881a93370e2d2f5e3267498b568e22796fd898484e5"} Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.676298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e"} Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.678366 4735 scope.go:117] "RemoveContainer" containerID="7222a1b64d7c065b72fd6056bf0125037227704c436bcaea0ab715b5f00b83f0" Mar 17 04:16:43 crc kubenswrapper[4735]: I0317 04:16:43.993058 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.123797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-utilities\") pod \"2299117f-53cb-4e14-b7a9-e99fa5005045\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.123896 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngjfb\" (UniqueName: \"kubernetes.io/projected/2299117f-53cb-4e14-b7a9-e99fa5005045-kube-api-access-ngjfb\") pod \"2299117f-53cb-4e14-b7a9-e99fa5005045\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") " Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.123986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-catalog-content\") pod \"2299117f-53cb-4e14-b7a9-e99fa5005045\" (UID: \"2299117f-53cb-4e14-b7a9-e99fa5005045\") 
" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.128381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-utilities" (OuterVolumeSpecName: "utilities") pod "2299117f-53cb-4e14-b7a9-e99fa5005045" (UID: "2299117f-53cb-4e14-b7a9-e99fa5005045"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.153750 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2299117f-53cb-4e14-b7a9-e99fa5005045-kube-api-access-ngjfb" (OuterVolumeSpecName: "kube-api-access-ngjfb") pod "2299117f-53cb-4e14-b7a9-e99fa5005045" (UID: "2299117f-53cb-4e14-b7a9-e99fa5005045"). InnerVolumeSpecName "kube-api-access-ngjfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.228356 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.228796 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngjfb\" (UniqueName: \"kubernetes.io/projected/2299117f-53cb-4e14-b7a9-e99fa5005045-kube-api-access-ngjfb\") on node \"crc\" DevicePath \"\"" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.242049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2299117f-53cb-4e14-b7a9-e99fa5005045" (UID: "2299117f-53cb-4e14-b7a9-e99fa5005045"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.329835 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299117f-53cb-4e14-b7a9-e99fa5005045-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.702888 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wd48s" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.703049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wd48s" event={"ID":"2299117f-53cb-4e14-b7a9-e99fa5005045","Type":"ContainerDied","Data":"c15d1a2c4f0f81bb1bc3f0410dfb5dd963a6d4a8ccd063747a3aa75ca44f7ec2"} Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.703113 4735 scope.go:117] "RemoveContainer" containerID="b0125f776647073c6cb9e5f50cedbe9fb60a3a75111855eb456779fab61912c1" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.731566 4735 scope.go:117] "RemoveContainer" containerID="cb080398318c4dbc6ff4313e4e10e6765efc7b211d096cce6df9dff7837cb729" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.756921 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wd48s"] Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.763172 4735 scope.go:117] "RemoveContainer" containerID="f97cca550e04b20cbd0058742ebfbaeca83f1b8aa7aae8dadc23db744da188da" Mar 17 04:16:44 crc kubenswrapper[4735]: I0317 04:16:44.766845 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wd48s"] Mar 17 04:16:45 crc kubenswrapper[4735]: I0317 04:16:45.086375 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" path="/var/lib/kubelet/pods/2299117f-53cb-4e14-b7a9-e99fa5005045/volumes" Mar 17 04:18:00 crc 
kubenswrapper[4735]: I0317 04:18:00.779700 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562018-2qkzs"] Mar 17 04:18:00 crc kubenswrapper[4735]: E0317 04:18:00.791802 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="extract-utilities" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.792085 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="extract-utilities" Mar 17 04:18:00 crc kubenswrapper[4735]: E0317 04:18:00.793055 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="extract-content" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.793077 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="extract-content" Mar 17 04:18:00 crc kubenswrapper[4735]: E0317 04:18:00.793096 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="registry-server" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.793103 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="registry-server" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.794878 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2299117f-53cb-4e14-b7a9-e99fa5005045" containerName="registry-server" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.803135 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.812269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs85\" (UniqueName: \"kubernetes.io/projected/168cd015-df53-47df-9187-fcdba9e11bc8-kube-api-access-cxs85\") pod \"auto-csr-approver-29562018-2qkzs\" (UID: \"168cd015-df53-47df-9187-fcdba9e11bc8\") " pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.817724 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.817738 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.817792 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.900642 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562018-2qkzs"] Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.914693 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs85\" (UniqueName: \"kubernetes.io/projected/168cd015-df53-47df-9187-fcdba9e11bc8-kube-api-access-cxs85\") pod \"auto-csr-approver-29562018-2qkzs\" (UID: \"168cd015-df53-47df-9187-fcdba9e11bc8\") " pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:00 crc kubenswrapper[4735]: I0317 04:18:00.975608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs85\" (UniqueName: \"kubernetes.io/projected/168cd015-df53-47df-9187-fcdba9e11bc8-kube-api-access-cxs85\") pod \"auto-csr-approver-29562018-2qkzs\" (UID: \"168cd015-df53-47df-9187-fcdba9e11bc8\") " 
pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:01 crc kubenswrapper[4735]: I0317 04:18:01.141908 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:02 crc kubenswrapper[4735]: I0317 04:18:02.228186 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562018-2qkzs"] Mar 17 04:18:02 crc kubenswrapper[4735]: I0317 04:18:02.610615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" event={"ID":"168cd015-df53-47df-9187-fcdba9e11bc8","Type":"ContainerStarted","Data":"2a31cb8a400825ddd02d72c5dc4657df2627c88e4d8d3e75b40c9e248169e5be"} Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.008639 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l649s"] Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.024138 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.033406 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l649s"] Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.179049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-utilities\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.179303 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzxkd\" (UniqueName: \"kubernetes.io/projected/6b41b3d9-084b-487f-ab47-181630b8d803-kube-api-access-nzxkd\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.179395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-catalog-content\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.281536 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-utilities\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.281633 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nzxkd\" (UniqueName: \"kubernetes.io/projected/6b41b3d9-084b-487f-ab47-181630b8d803-kube-api-access-nzxkd\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.281673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-catalog-content\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.287232 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-utilities\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.287770 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-catalog-content\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.339911 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzxkd\" (UniqueName: \"kubernetes.io/projected/6b41b3d9-084b-487f-ab47-181630b8d803-kube-api-access-nzxkd\") pod \"redhat-marketplace-l649s\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:03 crc kubenswrapper[4735]: I0317 04:18:03.368829 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:04 crc kubenswrapper[4735]: I0317 04:18:04.194017 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l649s"] Mar 17 04:18:04 crc kubenswrapper[4735]: I0317 04:18:04.634075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerDied","Data":"beb0d9843f09de0531d793f6f216ffe9c8a5e7d2959413e32416725922314ae2"} Mar 17 04:18:04 crc kubenswrapper[4735]: I0317 04:18:04.634661 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b41b3d9-084b-487f-ab47-181630b8d803" containerID="beb0d9843f09de0531d793f6f216ffe9c8a5e7d2959413e32416725922314ae2" exitCode=0 Mar 17 04:18:04 crc kubenswrapper[4735]: I0317 04:18:04.634747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerStarted","Data":"98e82ae2b86edb99ea831291acb3a57730425a55320cbcf75468838ffa7a6d05"} Mar 17 04:18:04 crc kubenswrapper[4735]: I0317 04:18:04.636500 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" event={"ID":"168cd015-df53-47df-9187-fcdba9e11bc8","Type":"ContainerStarted","Data":"be6cf30276bf5d376434e74ef18e02d9de7039443fcf8a6cb813d768f62f7a67"} Mar 17 04:18:04 crc kubenswrapper[4735]: I0317 04:18:04.701841 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" podStartSLOduration=3.634972435 podStartE2EDuration="4.700634276s" podCreationTimestamp="2026-03-17 04:18:00 +0000 UTC" firstStartedPulling="2026-03-17 04:18:02.281953263 +0000 UTC m=+11307.914186231" lastFinishedPulling="2026-03-17 04:18:03.347615094 +0000 UTC m=+11308.979848072" observedRunningTime="2026-03-17 04:18:04.692400296 
+0000 UTC m=+11310.324633274" watchObservedRunningTime="2026-03-17 04:18:04.700634276 +0000 UTC m=+11310.332867254" Mar 17 04:18:06 crc kubenswrapper[4735]: I0317 04:18:06.652459 4735 generic.go:334] "Generic (PLEG): container finished" podID="168cd015-df53-47df-9187-fcdba9e11bc8" containerID="be6cf30276bf5d376434e74ef18e02d9de7039443fcf8a6cb813d768f62f7a67" exitCode=0 Mar 17 04:18:06 crc kubenswrapper[4735]: I0317 04:18:06.652934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" event={"ID":"168cd015-df53-47df-9187-fcdba9e11bc8","Type":"ContainerDied","Data":"be6cf30276bf5d376434e74ef18e02d9de7039443fcf8a6cb813d768f62f7a67"} Mar 17 04:18:06 crc kubenswrapper[4735]: I0317 04:18:06.654697 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerStarted","Data":"5aee76466ffd58405b24f9c0a071a8fd97930fab06c1c6b733eadd5c5545b2da"} Mar 17 04:18:07 crc kubenswrapper[4735]: I0317 04:18:07.679029 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b41b3d9-084b-487f-ab47-181630b8d803" containerID="5aee76466ffd58405b24f9c0a071a8fd97930fab06c1c6b733eadd5c5545b2da" exitCode=0 Mar 17 04:18:07 crc kubenswrapper[4735]: I0317 04:18:07.680340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerDied","Data":"5aee76466ffd58405b24f9c0a071a8fd97930fab06c1c6b733eadd5c5545b2da"} Mar 17 04:18:08 crc kubenswrapper[4735]: I0317 04:18:08.705605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerStarted","Data":"3b1d078adcacdab095fe2415dbc57016a468d767e13172a6cd01e4335a7bc9fa"} Mar 17 04:18:08 crc kubenswrapper[4735]: I0317 04:18:08.751585 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l649s" podStartSLOduration=3.219438847 podStartE2EDuration="6.742419307s" podCreationTimestamp="2026-03-17 04:18:02 +0000 UTC" firstStartedPulling="2026-03-17 04:18:04.635683821 +0000 UTC m=+11310.267916799" lastFinishedPulling="2026-03-17 04:18:08.158664281 +0000 UTC m=+11313.790897259" observedRunningTime="2026-03-17 04:18:08.738611224 +0000 UTC m=+11314.370844202" watchObservedRunningTime="2026-03-17 04:18:08.742419307 +0000 UTC m=+11314.374652285" Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.022316 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.090097 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxs85\" (UniqueName: \"kubernetes.io/projected/168cd015-df53-47df-9187-fcdba9e11bc8-kube-api-access-cxs85\") pod \"168cd015-df53-47df-9187-fcdba9e11bc8\" (UID: \"168cd015-df53-47df-9187-fcdba9e11bc8\") " Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.132153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168cd015-df53-47df-9187-fcdba9e11bc8-kube-api-access-cxs85" (OuterVolumeSpecName: "kube-api-access-cxs85") pod "168cd015-df53-47df-9187-fcdba9e11bc8" (UID: "168cd015-df53-47df-9187-fcdba9e11bc8"). InnerVolumeSpecName "kube-api-access-cxs85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.194579 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxs85\" (UniqueName: \"kubernetes.io/projected/168cd015-df53-47df-9187-fcdba9e11bc8-kube-api-access-cxs85\") on node \"crc\" DevicePath \"\"" Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.723110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" event={"ID":"168cd015-df53-47df-9187-fcdba9e11bc8","Type":"ContainerDied","Data":"2a31cb8a400825ddd02d72c5dc4657df2627c88e4d8d3e75b40c9e248169e5be"} Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.723466 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562018-2qkzs" Mar 17 04:18:09 crc kubenswrapper[4735]: I0317 04:18:09.723922 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a31cb8a400825ddd02d72c5dc4657df2627c88e4d8d3e75b40c9e248169e5be" Mar 17 04:18:10 crc kubenswrapper[4735]: I0317 04:18:10.210133 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562012-z8ljs"] Mar 17 04:18:10 crc kubenswrapper[4735]: I0317 04:18:10.235956 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562012-z8ljs"] Mar 17 04:18:11 crc kubenswrapper[4735]: I0317 04:18:11.085838 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5e7bfc-e01c-4827-8094-221777002da4" path="/var/lib/kubelet/pods/0e5e7bfc-e01c-4827-8094-221777002da4/volumes" Mar 17 04:18:11 crc kubenswrapper[4735]: I0317 04:18:11.915706 4735 scope.go:117] "RemoveContainer" containerID="f8d82bba27bd4a14b1f539ee09e9396c601c28be31395049991a254cd547bd62" Mar 17 04:18:13 crc kubenswrapper[4735]: I0317 04:18:13.688641 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:13 crc kubenswrapper[4735]: I0317 04:18:13.702338 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:14 crc kubenswrapper[4735]: I0317 04:18:14.790723 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-l649s" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="registry-server" probeResult="failure" output=< Mar 17 04:18:14 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:18:14 crc kubenswrapper[4735]: > Mar 17 04:18:24 crc kubenswrapper[4735]: I0317 04:18:24.453488 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-l649s" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="registry-server" probeResult="failure" output=< Mar 17 04:18:24 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:18:24 crc kubenswrapper[4735]: > Mar 17 04:18:33 crc kubenswrapper[4735]: I0317 04:18:33.437831 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:33 crc kubenswrapper[4735]: I0317 04:18:33.485273 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:33 crc kubenswrapper[4735]: I0317 04:18:33.551973 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l649s"] Mar 17 04:18:34 crc kubenswrapper[4735]: I0317 04:18:34.979975 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l649s" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="registry-server" containerID="cri-o://3b1d078adcacdab095fe2415dbc57016a468d767e13172a6cd01e4335a7bc9fa" 
gracePeriod=2 Mar 17 04:18:35 crc kubenswrapper[4735]: I0317 04:18:35.987698 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerDied","Data":"3b1d078adcacdab095fe2415dbc57016a468d767e13172a6cd01e4335a7bc9fa"} Mar 17 04:18:35 crc kubenswrapper[4735]: I0317 04:18:35.989247 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b41b3d9-084b-487f-ab47-181630b8d803" containerID="3b1d078adcacdab095fe2415dbc57016a468d767e13172a6cd01e4335a7bc9fa" exitCode=0 Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.303065 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.454199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-utilities\") pod \"6b41b3d9-084b-487f-ab47-181630b8d803\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.454267 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-catalog-content\") pod \"6b41b3d9-084b-487f-ab47-181630b8d803\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.454531 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzxkd\" (UniqueName: \"kubernetes.io/projected/6b41b3d9-084b-487f-ab47-181630b8d803-kube-api-access-nzxkd\") pod \"6b41b3d9-084b-487f-ab47-181630b8d803\" (UID: \"6b41b3d9-084b-487f-ab47-181630b8d803\") " Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.460354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-utilities" (OuterVolumeSpecName: "utilities") pod "6b41b3d9-084b-487f-ab47-181630b8d803" (UID: "6b41b3d9-084b-487f-ab47-181630b8d803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.486325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b41b3d9-084b-487f-ab47-181630b8d803-kube-api-access-nzxkd" (OuterVolumeSpecName: "kube-api-access-nzxkd") pod "6b41b3d9-084b-487f-ab47-181630b8d803" (UID: "6b41b3d9-084b-487f-ab47-181630b8d803"). InnerVolumeSpecName "kube-api-access-nzxkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.508833 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b41b3d9-084b-487f-ab47-181630b8d803" (UID: "6b41b3d9-084b-487f-ab47-181630b8d803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.557457 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzxkd\" (UniqueName: \"kubernetes.io/projected/6b41b3d9-084b-487f-ab47-181630b8d803-kube-api-access-nzxkd\") on node \"crc\" DevicePath \"\"" Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.557698 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:18:36 crc kubenswrapper[4735]: I0317 04:18:36.557708 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b41b3d9-084b-487f-ab47-181630b8d803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.005660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l649s" event={"ID":"6b41b3d9-084b-487f-ab47-181630b8d803","Type":"ContainerDied","Data":"98e82ae2b86edb99ea831291acb3a57730425a55320cbcf75468838ffa7a6d05"} Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.005717 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l649s" Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.006259 4735 scope.go:117] "RemoveContainer" containerID="3b1d078adcacdab095fe2415dbc57016a468d767e13172a6cd01e4335a7bc9fa" Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.041906 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l649s"] Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.049334 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l649s"] Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.057544 4735 scope.go:117] "RemoveContainer" containerID="5aee76466ffd58405b24f9c0a071a8fd97930fab06c1c6b733eadd5c5545b2da" Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.083643 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" path="/var/lib/kubelet/pods/6b41b3d9-084b-487f-ab47-181630b8d803/volumes" Mar 17 04:18:37 crc kubenswrapper[4735]: I0317 04:18:37.247011 4735 scope.go:117] "RemoveContainer" containerID="beb0d9843f09de0531d793f6f216ffe9c8a5e7d2959413e32416725922314ae2" Mar 17 04:19:12 crc kubenswrapper[4735]: I0317 04:19:12.609609 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:19:12 crc kubenswrapper[4735]: I0317 04:19:12.615012 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:19:33 crc kubenswrapper[4735]: 
I0317 04:19:33.597080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"db53fa15-c77f-4396-aaae-d0110e90ddb6","Type":"ContainerDied","Data":"8eba0d557b8ff2882e72cfa8bb849be4bed9cc10ec840fb98f4015897c1e5c8d"} Mar 17 04:19:33 crc kubenswrapper[4735]: I0317 04:19:33.597056 4735 generic.go:334] "Generic (PLEG): container finished" podID="db53fa15-c77f-4396-aaae-d0110e90ddb6" containerID="8eba0d557b8ff2882e72cfa8bb849be4bed9cc10ec840fb98f4015897c1e5c8d" exitCode=0 Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.282415 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.373982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-workdir\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374255 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ssh-key\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374304 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config-secret\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374335 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzqn\" (UniqueName: \"kubernetes.io/projected/db53fa15-c77f-4396-aaae-d0110e90ddb6-kube-api-access-qfzqn\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374409 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-temporary\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374437 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-config-data\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374478 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.374512 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ca-certs\") pod \"db53fa15-c77f-4396-aaae-d0110e90ddb6\" (UID: \"db53fa15-c77f-4396-aaae-d0110e90ddb6\") " Mar 17 
04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.381687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.382617 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-config-data" (OuterVolumeSpecName: "config-data") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.392094 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.395314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db53fa15-c77f-4396-aaae-d0110e90ddb6-kube-api-access-qfzqn" (OuterVolumeSpecName: "kube-api-access-qfzqn") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "kube-api-access-qfzqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.395827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.428104 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.428931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.431115 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.456290 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "db53fa15-c77f-4396-aaae-d0110e90ddb6" (UID: "db53fa15-c77f-4396-aaae-d0110e90ddb6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476306 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476375 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476388 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzqn\" (UniqueName: \"kubernetes.io/projected/db53fa15-c77f-4396-aaae-d0110e90ddb6-kube-api-access-qfzqn\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476398 4735 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476410 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476419 4735 reconciler_common.go:293] "Volume detached 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db53fa15-c77f-4396-aaae-d0110e90ddb6-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476428 4735 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476437 4735 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/db53fa15-c77f-4396-aaae-d0110e90ddb6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.476445 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db53fa15-c77f-4396-aaae-d0110e90ddb6-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.498009 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.578284 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.629656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"db53fa15-c77f-4396-aaae-d0110e90ddb6","Type":"ContainerDied","Data":"759f9941d8fed608f99f9193fd5ded239d9488686d10bcad5df404df78c7d88c"} Mar 17 04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.629742 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="759f9941d8fed608f99f9193fd5ded239d9488686d10bcad5df404df78c7d88c" Mar 17 
04:19:36 crc kubenswrapper[4735]: I0317 04:19:36.629831 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Mar 17 04:19:42 crc kubenswrapper[4735]: I0317 04:19:42.606824 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:19:42 crc kubenswrapper[4735]: I0317 04:19:42.607506 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.274015 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 04:19:49 crc kubenswrapper[4735]: E0317 04:19:49.277418 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168cd015-df53-47df-9187-fcdba9e11bc8" containerName="oc" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.277485 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="168cd015-df53-47df-9187-fcdba9e11bc8" containerName="oc" Mar 17 04:19:49 crc kubenswrapper[4735]: E0317 04:19:49.278029 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db53fa15-c77f-4396-aaae-d0110e90ddb6" containerName="tempest-tests-tempest-tests-runner" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.278048 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="db53fa15-c77f-4396-aaae-d0110e90ddb6" containerName="tempest-tests-tempest-tests-runner" Mar 17 04:19:49 crc kubenswrapper[4735]: E0317 
04:19:49.278074 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="extract-utilities" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.278083 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="extract-utilities" Mar 17 04:19:49 crc kubenswrapper[4735]: E0317 04:19:49.278097 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="registry-server" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.278105 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="registry-server" Mar 17 04:19:49 crc kubenswrapper[4735]: E0317 04:19:49.278150 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="extract-content" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.278161 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="extract-content" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.279973 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="db53fa15-c77f-4396-aaae-d0110e90ddb6" containerName="tempest-tests-tempest-tests-runner" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.280017 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b41b3d9-084b-487f-ab47-181630b8d803" containerName="registry-server" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.280053 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="168cd015-df53-47df-9187-fcdba9e11bc8" containerName="oc" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.283514 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.292469 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mrnrm" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.303532 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.447575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.447781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8bk4\" (UniqueName: \"kubernetes.io/projected/c0b1008b-df7f-4c35-96c5-ddb5629af0f4-kube-api-access-n8bk4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.549940 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8bk4\" (UniqueName: \"kubernetes.io/projected/c0b1008b-df7f-4c35-96c5-ddb5629af0f4-kube-api-access-n8bk4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.550075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.551644 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.595517 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8bk4\" (UniqueName: \"kubernetes.io/projected/c0b1008b-df7f-4c35-96c5-ddb5629af0f4-kube-api-access-n8bk4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.610331 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0b1008b-df7f-4c35-96c5-ddb5629af0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:49 crc kubenswrapper[4735]: I0317 04:19:49.909936 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 04:19:50 crc kubenswrapper[4735]: I0317 04:19:50.532019 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 04:19:50 crc kubenswrapper[4735]: I0317 04:19:50.566830 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:19:50 crc kubenswrapper[4735]: I0317 04:19:50.762842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c0b1008b-df7f-4c35-96c5-ddb5629af0f4","Type":"ContainerStarted","Data":"0df47b4c042653b905d04f5d30337197119f805f2cdc5a03c104ab0d5ee811d0"} Mar 17 04:19:52 crc kubenswrapper[4735]: I0317 04:19:52.791285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c0b1008b-df7f-4c35-96c5-ddb5629af0f4","Type":"ContainerStarted","Data":"f6c50f81b2cfe4a6920a2fdc9551a29af17019295eb5f822c496ac87c71e370f"} Mar 17 04:19:52 crc kubenswrapper[4735]: I0317 04:19:52.857745 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.589584116 podStartE2EDuration="3.856502659s" podCreationTimestamp="2026-03-17 04:19:49 +0000 UTC" firstStartedPulling="2026-03-17 04:19:50.563357593 +0000 UTC m=+11416.195590581" lastFinishedPulling="2026-03-17 04:19:51.830276146 +0000 UTC m=+11417.462509124" observedRunningTime="2026-03-17 04:19:52.851060247 +0000 UTC m=+11418.483293225" watchObservedRunningTime="2026-03-17 04:19:52.856502659 +0000 UTC m=+11418.488735647" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.196425 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562020-k5rlj"] Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 
04:20:00.200525 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.203471 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.204423 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.207787 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.208638 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562020-k5rlj"] Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.368564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7h44\" (UniqueName: \"kubernetes.io/projected/73722a66-1ec7-4810-9026-94373db1e929-kube-api-access-z7h44\") pod \"auto-csr-approver-29562020-k5rlj\" (UID: \"73722a66-1ec7-4810-9026-94373db1e929\") " pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.470162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7h44\" (UniqueName: \"kubernetes.io/projected/73722a66-1ec7-4810-9026-94373db1e929-kube-api-access-z7h44\") pod \"auto-csr-approver-29562020-k5rlj\" (UID: \"73722a66-1ec7-4810-9026-94373db1e929\") " pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.498906 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7h44\" (UniqueName: \"kubernetes.io/projected/73722a66-1ec7-4810-9026-94373db1e929-kube-api-access-z7h44\") pod 
\"auto-csr-approver-29562020-k5rlj\" (UID: \"73722a66-1ec7-4810-9026-94373db1e929\") " pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:00 crc kubenswrapper[4735]: I0317 04:20:00.523791 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:01 crc kubenswrapper[4735]: I0317 04:20:01.055754 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562020-k5rlj"] Mar 17 04:20:01 crc kubenswrapper[4735]: I0317 04:20:01.876939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" event={"ID":"73722a66-1ec7-4810-9026-94373db1e929","Type":"ContainerStarted","Data":"9a7bb7bdc4b60bf39861535cd10ce82db3cce4dfa6aa94758e94679451731ad1"} Mar 17 04:20:05 crc kubenswrapper[4735]: I0317 04:20:05.922914 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" event={"ID":"73722a66-1ec7-4810-9026-94373db1e929","Type":"ContainerStarted","Data":"5a65265b3993eb32238489e4075b3293e3f5cd370291127819b098c83f309fc6"} Mar 17 04:20:05 crc kubenswrapper[4735]: I0317 04:20:05.946005 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" podStartSLOduration=2.528368402 podStartE2EDuration="5.945979236s" podCreationTimestamp="2026-03-17 04:20:00 +0000 UTC" firstStartedPulling="2026-03-17 04:20:01.07152266 +0000 UTC m=+11426.703755638" lastFinishedPulling="2026-03-17 04:20:04.489133484 +0000 UTC m=+11430.121366472" observedRunningTime="2026-03-17 04:20:05.940846461 +0000 UTC m=+11431.573079489" watchObservedRunningTime="2026-03-17 04:20:05.945979236 +0000 UTC m=+11431.578212254" Mar 17 04:20:06 crc kubenswrapper[4735]: I0317 04:20:06.955523 4735 generic.go:334] "Generic (PLEG): container finished" podID="73722a66-1ec7-4810-9026-94373db1e929" 
containerID="5a65265b3993eb32238489e4075b3293e3f5cd370291127819b098c83f309fc6" exitCode=0 Mar 17 04:20:06 crc kubenswrapper[4735]: I0317 04:20:06.955615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" event={"ID":"73722a66-1ec7-4810-9026-94373db1e929","Type":"ContainerDied","Data":"5a65265b3993eb32238489e4075b3293e3f5cd370291127819b098c83f309fc6"} Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.344818 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.479247 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7h44\" (UniqueName: \"kubernetes.io/projected/73722a66-1ec7-4810-9026-94373db1e929-kube-api-access-z7h44\") pod \"73722a66-1ec7-4810-9026-94373db1e929\" (UID: \"73722a66-1ec7-4810-9026-94373db1e929\") " Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.485182 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73722a66-1ec7-4810-9026-94373db1e929-kube-api-access-z7h44" (OuterVolumeSpecName: "kube-api-access-z7h44") pod "73722a66-1ec7-4810-9026-94373db1e929" (UID: "73722a66-1ec7-4810-9026-94373db1e929"). InnerVolumeSpecName "kube-api-access-z7h44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.582234 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7h44\" (UniqueName: \"kubernetes.io/projected/73722a66-1ec7-4810-9026-94373db1e929-kube-api-access-z7h44\") on node \"crc\" DevicePath \"\"" Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.983828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" event={"ID":"73722a66-1ec7-4810-9026-94373db1e929","Type":"ContainerDied","Data":"9a7bb7bdc4b60bf39861535cd10ce82db3cce4dfa6aa94758e94679451731ad1"} Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.983890 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a7bb7bdc4b60bf39861535cd10ce82db3cce4dfa6aa94758e94679451731ad1" Mar 17 04:20:08 crc kubenswrapper[4735]: I0317 04:20:08.983966 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562020-k5rlj" Mar 17 04:20:09 crc kubenswrapper[4735]: I0317 04:20:09.437315 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562014-vn2lc"] Mar 17 04:20:09 crc kubenswrapper[4735]: I0317 04:20:09.447597 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562014-vn2lc"] Mar 17 04:20:11 crc kubenswrapper[4735]: I0317 04:20:11.087581 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cabd609-66ca-4a0c-a84b-0f0d05aa6d49" path="/var/lib/kubelet/pods/1cabd609-66ca-4a0c-a84b-0f0d05aa6d49/volumes" Mar 17 04:20:12 crc kubenswrapper[4735]: I0317 04:20:12.343065 4735 scope.go:117] "RemoveContainer" containerID="5cf13667c753028bca250ef6d6d7a5b6781aad7cce08f994766d6dfa58863fe2" Mar 17 04:20:12 crc kubenswrapper[4735]: I0317 04:20:12.606108 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:20:12 crc kubenswrapper[4735]: I0317 04:20:12.606619 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:20:12 crc kubenswrapper[4735]: I0317 04:20:12.606774 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 04:20:12 crc kubenswrapper[4735]: I0317 04:20:12.607456 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 04:20:12 crc kubenswrapper[4735]: I0317 04:20:12.607582 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" gracePeriod=600 Mar 17 04:20:12 crc kubenswrapper[4735]: E0317 04:20:12.735502 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:20:13 crc kubenswrapper[4735]: I0317 04:20:13.030174 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" exitCode=0 Mar 17 04:20:13 crc kubenswrapper[4735]: I0317 04:20:13.030234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e"} Mar 17 04:20:13 crc kubenswrapper[4735]: I0317 04:20:13.030275 4735 scope.go:117] "RemoveContainer" containerID="a9a7238428e6cff794403881a93370e2d2f5e3267498b568e22796fd898484e5" Mar 17 04:20:13 crc kubenswrapper[4735]: I0317 04:20:13.031155 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:20:13 crc kubenswrapper[4735]: E0317 04:20:13.031461 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.730322 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfgbl/must-gather-586s4"] Mar 17 04:20:15 crc kubenswrapper[4735]: E0317 04:20:15.730959 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="73722a66-1ec7-4810-9026-94373db1e929" containerName="oc" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.730972 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73722a66-1ec7-4810-9026-94373db1e929" containerName="oc" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.731193 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="73722a66-1ec7-4810-9026-94373db1e929" containerName="oc" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.750394 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.757202 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tfgbl"/"kube-root-ca.crt" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.757444 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tfgbl"/"openshift-service-ca.crt" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.803190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfgbl/must-gather-586s4"] Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.907208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mcw\" (UniqueName: \"kubernetes.io/projected/83f7fc6c-fba5-429d-a031-19607709268d-kube-api-access-s7mcw\") pod \"must-gather-586s4\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:15 crc kubenswrapper[4735]: I0317 04:20:15.907838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f7fc6c-fba5-429d-a031-19607709268d-must-gather-output\") pod \"must-gather-586s4\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " pod="openshift-must-gather-tfgbl/must-gather-586s4" 
Mar 17 04:20:16 crc kubenswrapper[4735]: I0317 04:20:16.010847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f7fc6c-fba5-429d-a031-19607709268d-must-gather-output\") pod \"must-gather-586s4\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:16 crc kubenswrapper[4735]: I0317 04:20:16.013102 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mcw\" (UniqueName: \"kubernetes.io/projected/83f7fc6c-fba5-429d-a031-19607709268d-kube-api-access-s7mcw\") pod \"must-gather-586s4\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:16 crc kubenswrapper[4735]: I0317 04:20:16.011402 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f7fc6c-fba5-429d-a031-19607709268d-must-gather-output\") pod \"must-gather-586s4\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:16 crc kubenswrapper[4735]: I0317 04:20:16.033363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mcw\" (UniqueName: \"kubernetes.io/projected/83f7fc6c-fba5-429d-a031-19607709268d-kube-api-access-s7mcw\") pod \"must-gather-586s4\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:16 crc kubenswrapper[4735]: I0317 04:20:16.122450 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:20:16 crc kubenswrapper[4735]: I0317 04:20:16.626940 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfgbl/must-gather-586s4"] Mar 17 04:20:16 crc kubenswrapper[4735]: W0317 04:20:16.629539 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83f7fc6c_fba5_429d_a031_19607709268d.slice/crio-12bba9d519647c41108f0b018aac5dbada41b6b051a1b03664410ea50d059f8f WatchSource:0}: Error finding container 12bba9d519647c41108f0b018aac5dbada41b6b051a1b03664410ea50d059f8f: Status 404 returned error can't find the container with id 12bba9d519647c41108f0b018aac5dbada41b6b051a1b03664410ea50d059f8f Mar 17 04:20:17 crc kubenswrapper[4735]: I0317 04:20:17.070991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/must-gather-586s4" event={"ID":"83f7fc6c-fba5-429d-a031-19607709268d","Type":"ContainerStarted","Data":"12bba9d519647c41108f0b018aac5dbada41b6b051a1b03664410ea50d059f8f"} Mar 17 04:20:25 crc kubenswrapper[4735]: I0317 04:20:25.085779 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:20:25 crc kubenswrapper[4735]: E0317 04:20:25.111270 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:20:26 crc kubenswrapper[4735]: I0317 04:20:26.175395 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/must-gather-586s4" 
event={"ID":"83f7fc6c-fba5-429d-a031-19607709268d","Type":"ContainerStarted","Data":"7bcb43390a975a1b5dc00a7724b87f89fdcc41b07a3375cde841761a849ba6e5"} Mar 17 04:20:26 crc kubenswrapper[4735]: I0317 04:20:26.175642 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/must-gather-586s4" event={"ID":"83f7fc6c-fba5-429d-a031-19607709268d","Type":"ContainerStarted","Data":"5723ad18f329438013182ef228c80c5551b9f8bde15255a7d675b706e26640dc"} Mar 17 04:20:26 crc kubenswrapper[4735]: I0317 04:20:26.196966 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfgbl/must-gather-586s4" podStartSLOduration=2.762081653 podStartE2EDuration="11.196931787s" podCreationTimestamp="2026-03-17 04:20:15 +0000 UTC" firstStartedPulling="2026-03-17 04:20:16.630692807 +0000 UTC m=+11442.262925785" lastFinishedPulling="2026-03-17 04:20:25.065542931 +0000 UTC m=+11450.697775919" observedRunningTime="2026-03-17 04:20:26.191983307 +0000 UTC m=+11451.824216285" watchObservedRunningTime="2026-03-17 04:20:26.196931787 +0000 UTC m=+11451.829164805" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.482944 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-27rcv"] Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.484505 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.486530 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tfgbl"/"default-dockercfg-8clfw" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.648558 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7ls\" (UniqueName: \"kubernetes.io/projected/aae4fb95-f1ec-425e-b12c-4c5dce98c109-kube-api-access-cb7ls\") pod \"crc-debug-27rcv\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.648639 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae4fb95-f1ec-425e-b12c-4c5dce98c109-host\") pod \"crc-debug-27rcv\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.750791 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7ls\" (UniqueName: \"kubernetes.io/projected/aae4fb95-f1ec-425e-b12c-4c5dce98c109-kube-api-access-cb7ls\") pod \"crc-debug-27rcv\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.750874 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae4fb95-f1ec-425e-b12c-4c5dce98c109-host\") pod \"crc-debug-27rcv\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.751665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/aae4fb95-f1ec-425e-b12c-4c5dce98c109-host\") pod \"crc-debug-27rcv\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.772470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7ls\" (UniqueName: \"kubernetes.io/projected/aae4fb95-f1ec-425e-b12c-4c5dce98c109-kube-api-access-cb7ls\") pod \"crc-debug-27rcv\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:31 crc kubenswrapper[4735]: I0317 04:20:31.800447 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:20:32 crc kubenswrapper[4735]: I0317 04:20:32.236088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" event={"ID":"aae4fb95-f1ec-425e-b12c-4c5dce98c109","Type":"ContainerStarted","Data":"91cee1b3395210a197728e510d740950327c9938acadc834f9e916fa8bade988"} Mar 17 04:20:37 crc kubenswrapper[4735]: I0317 04:20:37.075987 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:20:37 crc kubenswrapper[4735]: E0317 04:20:37.076628 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:20:44 crc kubenswrapper[4735]: I0317 04:20:44.343076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" 
event={"ID":"aae4fb95-f1ec-425e-b12c-4c5dce98c109","Type":"ContainerStarted","Data":"d382c72f7ce3765cc4af5c233dd8ae35b6e204e3e475d39c9efd591883b6a73f"} Mar 17 04:20:44 crc kubenswrapper[4735]: I0317 04:20:44.360829 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" podStartSLOduration=1.983589915 podStartE2EDuration="13.360813821s" podCreationTimestamp="2026-03-17 04:20:31 +0000 UTC" firstStartedPulling="2026-03-17 04:20:31.835611448 +0000 UTC m=+11457.467844426" lastFinishedPulling="2026-03-17 04:20:43.212835354 +0000 UTC m=+11468.845068332" observedRunningTime="2026-03-17 04:20:44.355647996 +0000 UTC m=+11469.987880984" watchObservedRunningTime="2026-03-17 04:20:44.360813821 +0000 UTC m=+11469.993046799" Mar 17 04:20:50 crc kubenswrapper[4735]: I0317 04:20:50.072948 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:20:50 crc kubenswrapper[4735]: E0317 04:20:50.073689 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:21:05 crc kubenswrapper[4735]: I0317 04:21:05.078567 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:21:05 crc kubenswrapper[4735]: E0317 04:21:05.079268 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:21:17 crc kubenswrapper[4735]: I0317 04:21:17.073769 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:21:17 crc kubenswrapper[4735]: E0317 04:21:17.074468 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:21:28 crc kubenswrapper[4735]: I0317 04:21:28.074347 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:21:28 crc kubenswrapper[4735]: E0317 04:21:28.075772 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:21:30 crc kubenswrapper[4735]: I0317 04:21:30.728156 4735 generic.go:334] "Generic (PLEG): container finished" podID="aae4fb95-f1ec-425e-b12c-4c5dce98c109" containerID="d382c72f7ce3765cc4af5c233dd8ae35b6e204e3e475d39c9efd591883b6a73f" exitCode=0 Mar 17 04:21:30 crc kubenswrapper[4735]: I0317 04:21:30.728603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" 
event={"ID":"aae4fb95-f1ec-425e-b12c-4c5dce98c109","Type":"ContainerDied","Data":"d382c72f7ce3765cc4af5c233dd8ae35b6e204e3e475d39c9efd591883b6a73f"} Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.842390 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.876066 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-27rcv"] Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.894026 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-27rcv"] Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.900608 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae4fb95-f1ec-425e-b12c-4c5dce98c109-host\") pod \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.900715 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7ls\" (UniqueName: \"kubernetes.io/projected/aae4fb95-f1ec-425e-b12c-4c5dce98c109-kube-api-access-cb7ls\") pod \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\" (UID: \"aae4fb95-f1ec-425e-b12c-4c5dce98c109\") " Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.900746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aae4fb95-f1ec-425e-b12c-4c5dce98c109-host" (OuterVolumeSpecName: "host") pod "aae4fb95-f1ec-425e-b12c-4c5dce98c109" (UID: "aae4fb95-f1ec-425e-b12c-4c5dce98c109"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.901141 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae4fb95-f1ec-425e-b12c-4c5dce98c109-host\") on node \"crc\" DevicePath \"\"" Mar 17 04:21:31 crc kubenswrapper[4735]: I0317 04:21:31.907797 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae4fb95-f1ec-425e-b12c-4c5dce98c109-kube-api-access-cb7ls" (OuterVolumeSpecName: "kube-api-access-cb7ls") pod "aae4fb95-f1ec-425e-b12c-4c5dce98c109" (UID: "aae4fb95-f1ec-425e-b12c-4c5dce98c109"). InnerVolumeSpecName "kube-api-access-cb7ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:21:32 crc kubenswrapper[4735]: I0317 04:21:32.002509 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7ls\" (UniqueName: \"kubernetes.io/projected/aae4fb95-f1ec-425e-b12c-4c5dce98c109-kube-api-access-cb7ls\") on node \"crc\" DevicePath \"\"" Mar 17 04:21:32 crc kubenswrapper[4735]: I0317 04:21:32.747451 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91cee1b3395210a197728e510d740950327c9938acadc834f9e916fa8bade988" Mar 17 04:21:32 crc kubenswrapper[4735]: I0317 04:21:32.747557 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-27rcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.087073 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae4fb95-f1ec-425e-b12c-4c5dce98c109" path="/var/lib/kubelet/pods/aae4fb95-f1ec-425e-b12c-4c5dce98c109/volumes" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.113315 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-fdjcv"] Mar 17 04:21:33 crc kubenswrapper[4735]: E0317 04:21:33.113831 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae4fb95-f1ec-425e-b12c-4c5dce98c109" containerName="container-00" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.113871 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae4fb95-f1ec-425e-b12c-4c5dce98c109" containerName="container-00" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.114306 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae4fb95-f1ec-425e-b12c-4c5dce98c109" containerName="container-00" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.114948 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.118344 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tfgbl"/"default-dockercfg-8clfw" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.227778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d410e51-4dd2-4744-93da-9f1e31e03a51-host\") pod \"crc-debug-fdjcv\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.228049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz4wn\" (UniqueName: \"kubernetes.io/projected/8d410e51-4dd2-4744-93da-9f1e31e03a51-kube-api-access-xz4wn\") pod \"crc-debug-fdjcv\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.330077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d410e51-4dd2-4744-93da-9f1e31e03a51-host\") pod \"crc-debug-fdjcv\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.330635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz4wn\" (UniqueName: \"kubernetes.io/projected/8d410e51-4dd2-4744-93da-9f1e31e03a51-kube-api-access-xz4wn\") pod \"crc-debug-fdjcv\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.330271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8d410e51-4dd2-4744-93da-9f1e31e03a51-host\") pod \"crc-debug-fdjcv\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.352242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz4wn\" (UniqueName: \"kubernetes.io/projected/8d410e51-4dd2-4744-93da-9f1e31e03a51-kube-api-access-xz4wn\") pod \"crc-debug-fdjcv\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.433289 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.755204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" event={"ID":"8d410e51-4dd2-4744-93da-9f1e31e03a51","Type":"ContainerStarted","Data":"bdafc6053c26298391bb26adb83aae496be9ae81631ea64fead4295ed0a47d18"} Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.755249 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" event={"ID":"8d410e51-4dd2-4744-93da-9f1e31e03a51","Type":"ContainerStarted","Data":"dcdd5833adfd49af922d0afb7bc3b5786dc5a4e4ad5ed81c271b4ec6a8f13aea"} Mar 17 04:21:33 crc kubenswrapper[4735]: I0317 04:21:33.772368 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" podStartSLOduration=0.77233362 podStartE2EDuration="772.33362ms" podCreationTimestamp="2026-03-17 04:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:21:33.764880379 +0000 UTC m=+11519.397113357" watchObservedRunningTime="2026-03-17 04:21:33.77233362 +0000 UTC m=+11519.404566598" Mar 17 
04:21:34 crc kubenswrapper[4735]: I0317 04:21:34.765211 4735 generic.go:334] "Generic (PLEG): container finished" podID="8d410e51-4dd2-4744-93da-9f1e31e03a51" containerID="bdafc6053c26298391bb26adb83aae496be9ae81631ea64fead4295ed0a47d18" exitCode=0 Mar 17 04:21:34 crc kubenswrapper[4735]: I0317 04:21:34.765249 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" event={"ID":"8d410e51-4dd2-4744-93da-9f1e31e03a51","Type":"ContainerDied","Data":"bdafc6053c26298391bb26adb83aae496be9ae81631ea64fead4295ed0a47d18"} Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.866420 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.897418 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-fdjcv"] Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.904809 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-fdjcv"] Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.971811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz4wn\" (UniqueName: \"kubernetes.io/projected/8d410e51-4dd2-4744-93da-9f1e31e03a51-kube-api-access-xz4wn\") pod \"8d410e51-4dd2-4744-93da-9f1e31e03a51\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.971886 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d410e51-4dd2-4744-93da-9f1e31e03a51-host\") pod \"8d410e51-4dd2-4744-93da-9f1e31e03a51\" (UID: \"8d410e51-4dd2-4744-93da-9f1e31e03a51\") " Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.972007 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8d410e51-4dd2-4744-93da-9f1e31e03a51-host" (OuterVolumeSpecName: "host") pod "8d410e51-4dd2-4744-93da-9f1e31e03a51" (UID: "8d410e51-4dd2-4744-93da-9f1e31e03a51"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.972532 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d410e51-4dd2-4744-93da-9f1e31e03a51-host\") on node \"crc\" DevicePath \"\"" Mar 17 04:21:35 crc kubenswrapper[4735]: I0317 04:21:35.982608 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d410e51-4dd2-4744-93da-9f1e31e03a51-kube-api-access-xz4wn" (OuterVolumeSpecName: "kube-api-access-xz4wn") pod "8d410e51-4dd2-4744-93da-9f1e31e03a51" (UID: "8d410e51-4dd2-4744-93da-9f1e31e03a51"). InnerVolumeSpecName "kube-api-access-xz4wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:21:36 crc kubenswrapper[4735]: I0317 04:21:36.075434 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz4wn\" (UniqueName: \"kubernetes.io/projected/8d410e51-4dd2-4744-93da-9f1e31e03a51-kube-api-access-xz4wn\") on node \"crc\" DevicePath \"\"" Mar 17 04:21:36 crc kubenswrapper[4735]: I0317 04:21:36.782218 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcdd5833adfd49af922d0afb7bc3b5786dc5a4e4ad5ed81c271b4ec6a8f13aea" Mar 17 04:21:36 crc kubenswrapper[4735]: I0317 04:21:36.782276 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-fdjcv" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.082982 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d410e51-4dd2-4744-93da-9f1e31e03a51" path="/var/lib/kubelet/pods/8d410e51-4dd2-4744-93da-9f1e31e03a51/volumes" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.083482 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-wd798"] Mar 17 04:21:37 crc kubenswrapper[4735]: E0317 04:21:37.083802 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d410e51-4dd2-4744-93da-9f1e31e03a51" containerName="container-00" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.083815 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d410e51-4dd2-4744-93da-9f1e31e03a51" containerName="container-00" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.084050 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d410e51-4dd2-4744-93da-9f1e31e03a51" containerName="container-00" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.084643 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.086227 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tfgbl"/"default-dockercfg-8clfw" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.194330 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ed269d8-8133-42dd-bf0c-ef069f530a92-host\") pod \"crc-debug-wd798\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.195155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njp7p\" (UniqueName: \"kubernetes.io/projected/6ed269d8-8133-42dd-bf0c-ef069f530a92-kube-api-access-njp7p\") pod \"crc-debug-wd798\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.297294 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ed269d8-8133-42dd-bf0c-ef069f530a92-host\") pod \"crc-debug-wd798\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.297389 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njp7p\" (UniqueName: \"kubernetes.io/projected/6ed269d8-8133-42dd-bf0c-ef069f530a92-kube-api-access-njp7p\") pod \"crc-debug-wd798\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.297467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6ed269d8-8133-42dd-bf0c-ef069f530a92-host\") pod \"crc-debug-wd798\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.323356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njp7p\" (UniqueName: \"kubernetes.io/projected/6ed269d8-8133-42dd-bf0c-ef069f530a92-kube-api-access-njp7p\") pod \"crc-debug-wd798\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.400613 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:37 crc kubenswrapper[4735]: W0317 04:21:37.443946 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ed269d8_8133_42dd_bf0c_ef069f530a92.slice/crio-b93f4bfba49c705a1c07b4dd14ddf728c03bb69692bde5550afd86f3cdc8d0f7 WatchSource:0}: Error finding container b93f4bfba49c705a1c07b4dd14ddf728c03bb69692bde5550afd86f3cdc8d0f7: Status 404 returned error can't find the container with id b93f4bfba49c705a1c07b4dd14ddf728c03bb69692bde5550afd86f3cdc8d0f7 Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.792540 4735 generic.go:334] "Generic (PLEG): container finished" podID="6ed269d8-8133-42dd-bf0c-ef069f530a92" containerID="93c395d5dc30dc2b573dccc98c912367d786cd421f1ddfa781514a9a362af52f" exitCode=0 Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.792587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/crc-debug-wd798" event={"ID":"6ed269d8-8133-42dd-bf0c-ef069f530a92","Type":"ContainerDied","Data":"93c395d5dc30dc2b573dccc98c912367d786cd421f1ddfa781514a9a362af52f"} Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.792625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-tfgbl/crc-debug-wd798" event={"ID":"6ed269d8-8133-42dd-bf0c-ef069f530a92","Type":"ContainerStarted","Data":"b93f4bfba49c705a1c07b4dd14ddf728c03bb69692bde5550afd86f3cdc8d0f7"} Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.831196 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-wd798"] Mar 17 04:21:37 crc kubenswrapper[4735]: I0317 04:21:37.840059 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tfgbl/crc-debug-wd798"] Mar 17 04:21:38 crc kubenswrapper[4735]: I0317 04:21:38.887987 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.028995 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njp7p\" (UniqueName: \"kubernetes.io/projected/6ed269d8-8133-42dd-bf0c-ef069f530a92-kube-api-access-njp7p\") pod \"6ed269d8-8133-42dd-bf0c-ef069f530a92\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.029429 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ed269d8-8133-42dd-bf0c-ef069f530a92-host\") pod \"6ed269d8-8133-42dd-bf0c-ef069f530a92\" (UID: \"6ed269d8-8133-42dd-bf0c-ef069f530a92\") " Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.029955 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ed269d8-8133-42dd-bf0c-ef069f530a92-host" (OuterVolumeSpecName: "host") pod "6ed269d8-8133-42dd-bf0c-ef069f530a92" (UID: "6ed269d8-8133-42dd-bf0c-ef069f530a92"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.037284 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed269d8-8133-42dd-bf0c-ef069f530a92-kube-api-access-njp7p" (OuterVolumeSpecName: "kube-api-access-njp7p") pod "6ed269d8-8133-42dd-bf0c-ef069f530a92" (UID: "6ed269d8-8133-42dd-bf0c-ef069f530a92"). InnerVolumeSpecName "kube-api-access-njp7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.090787 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed269d8-8133-42dd-bf0c-ef069f530a92" path="/var/lib/kubelet/pods/6ed269d8-8133-42dd-bf0c-ef069f530a92/volumes" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.131736 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njp7p\" (UniqueName: \"kubernetes.io/projected/6ed269d8-8133-42dd-bf0c-ef069f530a92-kube-api-access-njp7p\") on node \"crc\" DevicePath \"\"" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.132240 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ed269d8-8133-42dd-bf0c-ef069f530a92-host\") on node \"crc\" DevicePath \"\"" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.811682 4735 scope.go:117] "RemoveContainer" containerID="93c395d5dc30dc2b573dccc98c912367d786cd421f1ddfa781514a9a362af52f" Mar 17 04:21:39 crc kubenswrapper[4735]: I0317 04:21:39.811991 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/crc-debug-wd798" Mar 17 04:21:42 crc kubenswrapper[4735]: I0317 04:21:42.072738 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:21:42 crc kubenswrapper[4735]: E0317 04:21:42.073236 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:21:57 crc kubenswrapper[4735]: I0317 04:21:57.074065 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:21:57 crc kubenswrapper[4735]: E0317 04:21:57.074723 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:21:57 crc kubenswrapper[4735]: I0317 04:21:57.832301 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648b74c4fd-9pw4q_f33736df-541b-4d36-bde6-08767c233625/barbican-api/0.log" Mar 17 04:21:57 crc kubenswrapper[4735]: I0317 04:21:57.994629 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648b74c4fd-9pw4q_f33736df-541b-4d36-bde6-08767c233625/barbican-api-log/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.060346 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-55996449db-5bw4p_8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f/barbican-keystone-listener/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.275149 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-55996449db-5bw4p_8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f/barbican-keystone-listener-log/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.333320 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-764bc646f-l9wnd_e4e10703-a66a-4bb3-b523-8d5d6e772c04/barbican-worker/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.370924 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-764bc646f-l9wnd_e4e10703-a66a-4bb3-b523-8d5d6e772c04/barbican-worker-log/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.548087 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9_33665f4a-8504-4b35-9850-d2a567b93418/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.749224 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/ceilometer-central-agent/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.840670 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/ceilometer-notification-agent/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.885461 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/sg-core/0.log" Mar 17 04:21:58 crc kubenswrapper[4735]: I0317 04:21:58.910692 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/proxy-httpd/0.log" Mar 17 
04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.236315 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e91a303-0695-4862-ad03-3c9828b5a3a5/cinder-api/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.243518 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e91a303-0695-4862-ad03-3c9828b5a3a5/cinder-api-log/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.311842 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_28b4ea44-a1cd-4c67-935c-abdfa2ddb16a/cinder-scheduler/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.473514 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_28b4ea44-a1cd-4c67-935c-abdfa2ddb16a/probe/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.525826 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp_ecd2012c-981d-4582-8dde-afedd29a4108/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.675298 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9jg77_eb6e1c65-c874-45d5-9c38-203b56f1385c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.787807 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-745d4b7977-nmmzk_274120fc-6c2e-4443-838e-3657b2f4eeef/init/0.log" Mar 17 04:21:59 crc kubenswrapper[4735]: I0317 04:21:59.969745 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-745d4b7977-nmmzk_274120fc-6c2e-4443-838e-3657b2f4eeef/init/0.log" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.094850 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj_8697513f-4f89-4aad-9315-2e2053207433/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.146937 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562022-qffmh"] Mar 17 04:22:00 crc kubenswrapper[4735]: E0317 04:22:00.147386 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed269d8-8133-42dd-bf0c-ef069f530a92" containerName="container-00" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.147407 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed269d8-8133-42dd-bf0c-ef069f530a92" containerName="container-00" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.147676 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed269d8-8133-42dd-bf0c-ef069f530a92" containerName="container-00" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.150167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.153426 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.153647 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.153821 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.159681 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562022-qffmh"] Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.196561 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvcp\" (UniqueName: \"kubernetes.io/projected/02dc698c-8baa-46a7-ab60-936c016790b9-kube-api-access-gkvcp\") pod \"auto-csr-approver-29562022-qffmh\" (UID: \"02dc698c-8baa-46a7-ab60-936c016790b9\") " pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.197378 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-745d4b7977-nmmzk_274120fc-6c2e-4443-838e-3657b2f4eeef/dnsmasq-dns/0.log" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.298361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvcp\" (UniqueName: \"kubernetes.io/projected/02dc698c-8baa-46a7-ab60-936c016790b9-kube-api-access-gkvcp\") pod \"auto-csr-approver-29562022-qffmh\" (UID: \"02dc698c-8baa-46a7-ab60-936c016790b9\") " pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.314955 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gkvcp\" (UniqueName: \"kubernetes.io/projected/02dc698c-8baa-46a7-ab60-936c016790b9-kube-api-access-gkvcp\") pod \"auto-csr-approver-29562022-qffmh\" (UID: \"02dc698c-8baa-46a7-ab60-936c016790b9\") " pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.364309 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc51e2e8-eeb0-49e2-8893-f2d1aca3be95/glance-log/0.log" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.398407 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc51e2e8-eeb0-49e2-8893-f2d1aca3be95/glance-httpd/0.log" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.467974 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.633215 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e8f3c02b-c96f-469b-86df-21a5342bca54/glance-log/0.log" Mar 17 04:22:00 crc kubenswrapper[4735]: I0317 04:22:00.765042 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e8f3c02b-c96f-469b-86df-21a5342bca54/glance-httpd/0.log" Mar 17 04:22:01 crc kubenswrapper[4735]: I0317 04:22:01.025879 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562022-qffmh"] Mar 17 04:22:01 crc kubenswrapper[4735]: I0317 04:22:01.367654 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-55bb658757-9w8c7_4da6ec53-9801-4b92-b096-a55e63155103/heat-engine/0.log" Mar 17 04:22:01 crc kubenswrapper[4735]: I0317 04:22:01.869820 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76cdf95cd8-vx5pd_fc3f6d90-40e7-4962-b788-1e9924edb48f/horizon/0.log" Mar 17 04:22:02 crc 
kubenswrapper[4735]: I0317 04:22:02.016347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562022-qffmh" event={"ID":"02dc698c-8baa-46a7-ab60-936c016790b9","Type":"ContainerStarted","Data":"f25dbaa7ab59be248d2d9aade171e6f24a42fd0a3a7fb7f655da83bdf9a381c0"} Mar 17 04:22:02 crc kubenswrapper[4735]: I0317 04:22:02.403367 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk_9853dc02-da1f-43e7-ab8a-aa9aeae6ff65/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:02 crc kubenswrapper[4735]: I0317 04:22:02.669450 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pgc7b_cb9f2661-6c58-4a56-8080-2701d2dc456a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:02 crc kubenswrapper[4735]: I0317 04:22:02.810999 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6c568f7f98-2x6ww_0ac8a8c1-5baf-48c7-886f-8701fa5ce663/heat-api/0.log" Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.028840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562022-qffmh" event={"ID":"02dc698c-8baa-46a7-ab60-936c016790b9","Type":"ContainerStarted","Data":"e0b4583f559a4bcf68ad48a52b38378940c0ab5d18d808b4b307e2509ccb70e2"} Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.056550 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562022-qffmh" podStartSLOduration=2.031561593 podStartE2EDuration="3.056528215s" podCreationTimestamp="2026-03-17 04:22:00 +0000 UTC" firstStartedPulling="2026-03-17 04:22:01.037204384 +0000 UTC m=+11546.669437352" lastFinishedPulling="2026-03-17 04:22:02.062170996 +0000 UTC m=+11547.694403974" observedRunningTime="2026-03-17 04:22:03.045139609 +0000 UTC m=+11548.677372587" 
watchObservedRunningTime="2026-03-17 04:22:03.056528215 +0000 UTC m=+11548.688761193" Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.081532 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561881-lgzt4_9576ffce-8ade-47e7-8f60-c928cb0116b5/keystone-cron/0.log" Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.273117 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7fb7d7d79f-sbp8m_7534832e-8997-4cf3-8f8a-c8f91debac15/heat-cfnapi/0.log" Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.318369 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561941-shsnj_fde14ac4-f83c-412e-ba07-d1ce0e8368d6/keystone-cron/0.log" Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.584083 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29562001-vwd6p_1115ef04-a8ec-44b6-941d-a540ac1c895b/keystone-cron/0.log" Mar 17 04:22:03 crc kubenswrapper[4735]: I0317 04:22:03.909248 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0da67913-d53f-43fb-8100-c10acda35893/kube-state-metrics/0.log" Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.034761 4735 generic.go:334] "Generic (PLEG): container finished" podID="02dc698c-8baa-46a7-ab60-936c016790b9" containerID="e0b4583f559a4bcf68ad48a52b38378940c0ab5d18d808b4b307e2509ccb70e2" exitCode=0 Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.034800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562022-qffmh" event={"ID":"02dc698c-8baa-46a7-ab60-936c016790b9","Type":"ContainerDied","Data":"e0b4583f559a4bcf68ad48a52b38378940c0ab5d18d808b4b307e2509ccb70e2"} Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.064159 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vswmz_7354d451-8e24-4c65-855c-e6a33c66d134/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.262676 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76cdf95cd8-vx5pd_fc3f6d90-40e7-4962-b788-1e9924edb48f/horizon-log/0.log" Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.646235 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-ddf9f84d9-bh5g9_c6f23de1-8a7f-4f5f-986a-df17ee65d752/neutron-httpd/0.log" Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.791497 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-9455b57b9-66mtr_5a6d3039-699c-477b-a835-2b6fa2709dde/keystone-api/0.log" Mar 17 04:22:04 crc kubenswrapper[4735]: I0317 04:22:04.887607 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r_3a17772e-46c6-4b61-8fd1-0626f2097355/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:05 crc kubenswrapper[4735]: I0317 04:22:05.454981 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:05 crc kubenswrapper[4735]: I0317 04:22:05.599195 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkvcp\" (UniqueName: \"kubernetes.io/projected/02dc698c-8baa-46a7-ab60-936c016790b9-kube-api-access-gkvcp\") pod \"02dc698c-8baa-46a7-ab60-936c016790b9\" (UID: \"02dc698c-8baa-46a7-ab60-936c016790b9\") " Mar 17 04:22:05 crc kubenswrapper[4735]: I0317 04:22:05.612163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02dc698c-8baa-46a7-ab60-936c016790b9-kube-api-access-gkvcp" (OuterVolumeSpecName: "kube-api-access-gkvcp") pod "02dc698c-8baa-46a7-ab60-936c016790b9" (UID: "02dc698c-8baa-46a7-ab60-936c016790b9"). InnerVolumeSpecName "kube-api-access-gkvcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:22:05 crc kubenswrapper[4735]: I0317 04:22:05.700948 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkvcp\" (UniqueName: \"kubernetes.io/projected/02dc698c-8baa-46a7-ab60-936c016790b9-kube-api-access-gkvcp\") on node \"crc\" DevicePath \"\"" Mar 17 04:22:05 crc kubenswrapper[4735]: I0317 04:22:05.856454 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-ddf9f84d9-bh5g9_c6f23de1-8a7f-4f5f-986a-df17ee65d752/neutron-api/0.log" Mar 17 04:22:05 crc kubenswrapper[4735]: I0317 04:22:05.975349 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8aaa374c-be86-4c12-81f6-cef430c7a160/nova-cell0-conductor-conductor/0.log" Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.053649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562022-qffmh" event={"ID":"02dc698c-8baa-46a7-ab60-936c016790b9","Type":"ContainerDied","Data":"f25dbaa7ab59be248d2d9aade171e6f24a42fd0a3a7fb7f655da83bdf9a381c0"} Mar 17 04:22:06 crc 
kubenswrapper[4735]: I0317 04:22:06.053689 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f25dbaa7ab59be248d2d9aade171e6f24a42fd0a3a7fb7f655da83bdf9a381c0" Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.053745 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562022-qffmh" Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.126027 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562016-q5k82"] Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.141297 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562016-q5k82"] Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.368152 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0959245b-5ccd-4b6a-925a-1032c2761405/nova-cell1-conductor-conductor/0.log" Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.819312 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f7a2283d-990f-46e0-b9e7-e2891468873c/nova-cell1-novncproxy-novncproxy/0.log" Mar 17 04:22:06 crc kubenswrapper[4735]: I0317 04:22:06.931570 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8s2pf_69e281de-fd13-450e-acf8-ee5f0561f0b9/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:07 crc kubenswrapper[4735]: I0317 04:22:07.087447 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57164bf3-2763-422b-94be-2c1590adc27d" path="/var/lib/kubelet/pods/57164bf3-2763-422b-94be-2c1590adc27d/volumes" Mar 17 04:22:07 crc kubenswrapper[4735]: I0317 04:22:07.282577 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6a8bc38a-47a7-4623-bf64-ac77d137ceb4/nova-metadata-log/0.log" Mar 17 04:22:08 crc kubenswrapper[4735]: I0317 
04:22:08.358921 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976/nova-api-log/0.log" Mar 17 04:22:08 crc kubenswrapper[4735]: I0317 04:22:08.509343 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b/nova-scheduler-scheduler/0.log" Mar 17 04:22:08 crc kubenswrapper[4735]: I0317 04:22:08.760338 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_632237b7-0f5e-426d-ae9e-e434ac0e1da6/mysql-bootstrap/0.log" Mar 17 04:22:08 crc kubenswrapper[4735]: I0317 04:22:08.924073 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_632237b7-0f5e-426d-ae9e-e434ac0e1da6/mysql-bootstrap/0.log" Mar 17 04:22:09 crc kubenswrapper[4735]: I0317 04:22:09.039540 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_632237b7-0f5e-426d-ae9e-e434ac0e1da6/galera/0.log" Mar 17 04:22:09 crc kubenswrapper[4735]: I0317 04:22:09.261875 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_479dabcd-a158-4f12-9baf-77981001d3b3/mysql-bootstrap/0.log" Mar 17 04:22:09 crc kubenswrapper[4735]: I0317 04:22:09.402238 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6a8bc38a-47a7-4623-bf64-ac77d137ceb4/nova-metadata-metadata/0.log" Mar 17 04:22:09 crc kubenswrapper[4735]: I0317 04:22:09.505430 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_479dabcd-a158-4f12-9baf-77981001d3b3/galera/0.log" Mar 17 04:22:09 crc kubenswrapper[4735]: I0317 04:22:09.574769 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_479dabcd-a158-4f12-9baf-77981001d3b3/mysql-bootstrap/0.log" Mar 17 04:22:09 crc kubenswrapper[4735]: I0317 04:22:09.869908 4735 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstackclient_8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c/openstackclient/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.029290 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9ngz2_8d35e229-2eeb-4843-a9a2-763156affef5/ovn-controller/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.098712 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lg72z_5bf91339-aeb6-4e5f-b712-295b0fcd38b9/openstack-network-exporter/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.231723 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976/nova-api-api/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.377053 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovsdb-server-init/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.509215 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovsdb-server-init/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.607065 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovsdb-server/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.644211 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovs-vswitchd/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.820045 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-pwsgn_039a6699-7fc5-48c1-89b0-0e3946f69349/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:10 crc kubenswrapper[4735]: I0317 04:22:10.870821 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff00868f-9bd8-42cd-818b-b69373acd188/ovn-northd/0.log" Mar 17 04:22:11 crc kubenswrapper[4735]: I0317 04:22:11.077639 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:22:11 crc kubenswrapper[4735]: E0317 04:22:11.077830 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:22:11 crc kubenswrapper[4735]: I0317 04:22:11.127499 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff00868f-9bd8-42cd-818b-b69373acd188/openstack-network-exporter/0.log" Mar 17 04:22:11 crc kubenswrapper[4735]: I0317 04:22:11.368262 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8507883-2fad-4c5c-90c9-58eb17711bb3/openstack-network-exporter/0.log" Mar 17 04:22:11 crc kubenswrapper[4735]: I0317 04:22:11.395748 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8507883-2fad-4c5c-90c9-58eb17711bb3/ovsdbserver-nb/0.log" Mar 17 04:22:11 crc kubenswrapper[4735]: I0317 04:22:11.640599 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6be8d2b-aab5-415b-bb98-298563e9f719/ovsdbserver-sb/0.log" Mar 17 04:22:11 crc kubenswrapper[4735]: I0317 04:22:11.651140 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6be8d2b-aab5-415b-bb98-298563e9f719/openstack-network-exporter/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.146509 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5023c673-f338-49b0-b6ef-9bf53abfdb28/setup-container/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.236812 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5023c673-f338-49b0-b6ef-9bf53abfdb28/setup-container/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.433532 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5023c673-f338-49b0-b6ef-9bf53abfdb28/rabbitmq/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.453392 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9b8f4bc48-rdtm4_d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a/placement-api/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.514395 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9b8f4bc48-rdtm4_d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a/placement-log/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.530915 4735 scope.go:117] "RemoveContainer" containerID="4a6dcf4eb9b626857a6dc38d00b6c5a2cedb7639ee0797df4e12f021a07b5626" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.732671 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_477a21f3-fdbe-42ea-bcd9-05fc4dca6a52/setup-container/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.950579 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_477a21f3-fdbe-42ea-bcd9-05fc4dca6a52/rabbitmq/0.log" Mar 17 04:22:12 crc kubenswrapper[4735]: I0317 04:22:12.966774 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_477a21f3-fdbe-42ea-bcd9-05fc4dca6a52/setup-container/0.log" Mar 17 04:22:13 crc kubenswrapper[4735]: I0317 04:22:13.059906 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9_2a7a18af-48a3-47f8-8318-e6070738d82f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:13 crc kubenswrapper[4735]: I0317 04:22:13.612293 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6t9fc_afc10dcf-31e4-477c-8f5c-310c6da60988/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:13 crc kubenswrapper[4735]: I0317 04:22:13.612582 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn_9cc576b2-b1b2-426f-bec9-ebcd2ad12150/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:13 crc kubenswrapper[4735]: I0317 04:22:13.614892 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-p8mnq_6136d26b-0db1-42a3-80df-aac1dd6daf50/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:13 crc kubenswrapper[4735]: I0317 04:22:13.978104 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4j8fl_62ab1715-a702-460a-9195-4646d98e2620/ssh-known-hosts-edpm-deployment/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.238782 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j75nx_a2639420-3978-4b55-a81e-f1e770e09cf2/swift-ring-rebalance/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.262943 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544585c649-tthkx_944ea7da-36e9-4cb9-a65c-ff8730df5107/proxy-server/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.492825 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544585c649-tthkx_944ea7da-36e9-4cb9-a65c-ff8730df5107/proxy-httpd/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.595358 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-reaper/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.599224 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-auditor/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.785801 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-replicator/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.794464 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-server/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.865609 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-auditor/0.log" Mar 17 04:22:14 crc kubenswrapper[4735]: I0317 04:22:14.932971 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-replicator/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.005125 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-server/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.043885 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-updater/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.148548 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-auditor/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.209303 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-expirer/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.281838 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-server/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.324447 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-replicator/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.421143 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-updater/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.440768 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/rsync/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.529644 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/swift-recon-cron/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.767966 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs_d4354aba-95a7-4d25-a2e5-5935a961a0d1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:15 crc kubenswrapper[4735]: I0317 04:22:15.819392 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-multi-thread-testing_07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d/tempest-tests-tempest-tests-runner/0.log" Mar 17 04:22:16 crc kubenswrapper[4735]: I0317 04:22:16.015668 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-thread-testing_db53fa15-c77f-4396-aaae-d0110e90ddb6/tempest-tests-tempest-tests-runner/0.log" Mar 17 04:22:16 crc 
kubenswrapper[4735]: I0317 04:22:16.109227 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c0b1008b-df7f-4c35-96c5-ddb5629af0f4/test-operator-logs-container/0.log" Mar 17 04:22:16 crc kubenswrapper[4735]: I0317 04:22:16.282010 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5_979aea20-7a48-4bff-9188-685c664c6a78/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:22:23 crc kubenswrapper[4735]: I0317 04:22:23.073190 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:22:23 crc kubenswrapper[4735]: E0317 04:22:23.073873 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:22:23 crc kubenswrapper[4735]: I0317 04:22:23.899791 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a6d7635e-1b43-4bd6-af6d-234b9502545e/memcached/0.log" Mar 17 04:22:34 crc kubenswrapper[4735]: I0317 04:22:34.073637 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:22:34 crc kubenswrapper[4735]: E0317 04:22:34.074943 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:22:45 crc kubenswrapper[4735]: I0317 04:22:45.123377 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/util/0.log" Mar 17 04:22:45 crc kubenswrapper[4735]: I0317 04:22:45.615748 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/util/0.log" Mar 17 04:22:45 crc kubenswrapper[4735]: I0317 04:22:45.623500 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/pull/0.log" Mar 17 04:22:45 crc kubenswrapper[4735]: I0317 04:22:45.675785 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/pull/0.log" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.031982 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/pull/0.log" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.044000 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/extract/0.log" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.070264 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/util/0.log" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.073847 
4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:22:46 crc kubenswrapper[4735]: E0317 04:22:46.074171 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.483989 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-wtjk5_33c29a1c-b6e1-4b71-a0ed-b9a8851a0558/manager/0.log" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.630285 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-mrkzz_4b767f40-0d53-4067-a546-0f14da7659bc/manager/0.log" Mar 17 04:22:46 crc kubenswrapper[4735]: I0317 04:22:46.946353 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-2tj58_da31620d-afd3-4129-a12b-bfddaead4abd/manager/0.log" Mar 17 04:22:47 crc kubenswrapper[4735]: I0317 04:22:47.224149 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-2gnd8_250680a5-b697-4c2b-9180-3919204f246e/manager/0.log" Mar 17 04:22:47 crc kubenswrapper[4735]: I0317 04:22:47.361005 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-r4z8g_a628f7da-d487-43c7-9965-5697505667fb/manager/0.log" Mar 17 04:22:47 crc kubenswrapper[4735]: I0317 04:22:47.788944 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-t6nkw_8c6fa0c7-2f56-4498-87ee-7ae0f64f262e/manager/0.log" Mar 17 04:22:47 crc kubenswrapper[4735]: I0317 04:22:47.871092 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-v78gc_5fc876da-b6d0-4c8b-ab2e-84558e5ba079/manager/0.log" Mar 17 04:22:48 crc kubenswrapper[4735]: I0317 04:22:48.168069 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-vm55l_b596dfa6-ef2c-4c3c-80fb-f18229f7b99f/manager/0.log" Mar 17 04:22:48 crc kubenswrapper[4735]: I0317 04:22:48.315130 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-7cjd9_9c858fbe-58b8-4dea-91e5-05366d1bd648/manager/0.log" Mar 17 04:22:48 crc kubenswrapper[4735]: I0317 04:22:48.609392 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-gwwjh_4701b52d-3086-4792-bc35-c51cf4d63ad8/manager/0.log" Mar 17 04:22:48 crc kubenswrapper[4735]: I0317 04:22:48.879427 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-4dmmd_6306f8c1-066b-46e3-b76c-5490680c0ae3/manager/0.log" Mar 17 04:22:48 crc kubenswrapper[4735]: I0317 04:22:48.981768 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dsd6v_37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575/manager/0.log" Mar 17 04:22:49 crc kubenswrapper[4735]: I0317 04:22:49.059556 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-lhsm8_486810ee-5bdf-451a-bc69-179723bbe75d/manager/0.log" Mar 17 04:22:49 crc kubenswrapper[4735]: I0317 04:22:49.523413 4735 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-fwv89_77baae0d-2b5c-4b05-ab71-c87259ef645a/manager/0.log" Mar 17 04:22:49 crc kubenswrapper[4735]: I0317 04:22:49.535251 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-kklq4_eefbeadd-f37e-4419-8d66-ba1731016fd0/manager/0.log" Mar 17 04:22:49 crc kubenswrapper[4735]: I0317 04:22:49.998166 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6597c75466-b4kfj_f640bb25-8c1e-4718-9f44-dec9ab10fbb9/operator/0.log" Mar 17 04:22:50 crc kubenswrapper[4735]: I0317 04:22:50.125958 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-djqxj_0b8682ae-cefe-457b-b2b2-76753bc1db5f/registry-server/0.log" Mar 17 04:22:50 crc kubenswrapper[4735]: I0317 04:22:50.510366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-t8wv8_1f207812-4855-4e6c-9c7d-64d45ac3c917/manager/0.log" Mar 17 04:22:50 crc kubenswrapper[4735]: I0317 04:22:50.518005 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-257t8_ae2a2402-a926-4331-aed4-5b25fc55b9ba/manager/0.log" Mar 17 04:22:50 crc kubenswrapper[4735]: I0317 04:22:50.795412 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5brsc_c86b3820-e6e8-41e2-85f4-69a8fda476a2/operator/0.log" Mar 17 04:22:50 crc kubenswrapper[4735]: I0317 04:22:50.831633 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-v2knm_4ffd9139-49ec-4622-bed5-0744011e0469/manager/0.log" Mar 17 04:22:50 crc kubenswrapper[4735]: I0317 04:22:50.949539 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-576dc457f-z87tj_9a63ed64-5e87-4f3d-8568-284237818e90/manager/0.log" Mar 17 04:22:51 crc kubenswrapper[4735]: I0317 04:22:51.058594 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-wnncr_e786fada-7145-4c03-a0bb-073321237c38/manager/0.log" Mar 17 04:22:51 crc kubenswrapper[4735]: I0317 04:22:51.113180 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-p9zdz_9e4382a5-eb49-4a6b-8ee0-4692cad88ae7/manager/0.log" Mar 17 04:22:51 crc kubenswrapper[4735]: I0317 04:22:51.264675 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-txd6q_f6c692fb-2aa5-47c4-8036-265ad9d63131/manager/0.log" Mar 17 04:22:59 crc kubenswrapper[4735]: I0317 04:22:59.072958 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:22:59 crc kubenswrapper[4735]: E0317 04:22:59.073681 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:23:14 crc kubenswrapper[4735]: I0317 04:23:14.030685 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hsd5h_76c60596-d8ac-452c-bf33-b35157ee0970/control-plane-machine-set-operator/0.log" Mar 17 04:23:14 crc kubenswrapper[4735]: I0317 04:23:14.072817 4735 scope.go:117] "RemoveContainer" 
containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:23:14 crc kubenswrapper[4735]: E0317 04:23:14.073814 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:23:14 crc kubenswrapper[4735]: I0317 04:23:14.198638 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xqdp7_d1d16522-3b99-4ee4-b1ae-901b135c661d/kube-rbac-proxy/0.log" Mar 17 04:23:14 crc kubenswrapper[4735]: I0317 04:23:14.235638 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xqdp7_d1d16522-3b99-4ee4-b1ae-901b135c661d/machine-api-operator/0.log" Mar 17 04:23:27 crc kubenswrapper[4735]: I0317 04:23:27.073900 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:23:27 crc kubenswrapper[4735]: E0317 04:23:27.074530 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:23:28 crc kubenswrapper[4735]: I0317 04:23:28.870450 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4t98c_a4d0b9dc-dc40-431d-9fc3-8a45378faaf9/cert-manager-controller/0.log" Mar 17 04:23:29 
crc kubenswrapper[4735]: I0317 04:23:29.119575 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5g5gl_f026bd7d-4093-433f-b42a-e2f88dbd2c7f/cert-manager-cainjector/0.log" Mar 17 04:23:29 crc kubenswrapper[4735]: I0317 04:23:29.178267 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-g9frk_2c5a87ba-17e0-4807-980f-f42af0ffb51a/cert-manager-webhook/0.log" Mar 17 04:23:40 crc kubenswrapper[4735]: I0317 04:23:40.073646 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:23:40 crc kubenswrapper[4735]: E0317 04:23:40.074274 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:23:44 crc kubenswrapper[4735]: I0317 04:23:44.374786 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-4jtvs_dfa311dd-f771-40b1-9b71-0dd3f2e09ba6/nmstate-console-plugin/0.log" Mar 17 04:23:44 crc kubenswrapper[4735]: I0317 04:23:44.620628 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qvt46_5f46807e-1877-476e-aeb0-7e1acfe206da/nmstate-handler/0.log" Mar 17 04:23:44 crc kubenswrapper[4735]: I0317 04:23:44.645511 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tl9ds_1a943747-e426-437c-a203-7d326d2b1cc1/kube-rbac-proxy/0.log" Mar 17 04:23:44 crc kubenswrapper[4735]: I0317 04:23:44.704077 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tl9ds_1a943747-e426-437c-a203-7d326d2b1cc1/nmstate-metrics/0.log" Mar 17 04:23:44 crc kubenswrapper[4735]: I0317 04:23:44.914005 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-6h9tn_a2a8ea5f-24b7-4ba0-a197-9f9700436a0e/nmstate-operator/0.log" Mar 17 04:23:44 crc kubenswrapper[4735]: I0317 04:23:44.986456 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rbclm_b4c72542-813d-4029-a6ef-f76feb3f6459/nmstate-webhook/0.log" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.102180 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lt495"] Mar 17 04:23:47 crc kubenswrapper[4735]: E0317 04:23:47.102889 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dc698c-8baa-46a7-ab60-936c016790b9" containerName="oc" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.102926 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dc698c-8baa-46a7-ab60-936c016790b9" containerName="oc" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.103160 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="02dc698c-8baa-46a7-ab60-936c016790b9" containerName="oc" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.104540 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.121274 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt495"] Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.207799 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x25p2\" (UniqueName: \"kubernetes.io/projected/56f7d486-3a40-49d5-8de1-eced7be397fe-kube-api-access-x25p2\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.208133 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-catalog-content\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.208286 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-utilities\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.309537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-utilities\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.309633 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x25p2\" (UniqueName: \"kubernetes.io/projected/56f7d486-3a40-49d5-8de1-eced7be397fe-kube-api-access-x25p2\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.309655 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-catalog-content\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.310653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-catalog-content\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.311003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-utilities\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.367637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x25p2\" (UniqueName: \"kubernetes.io/projected/56f7d486-3a40-49d5-8de1-eced7be397fe-kube-api-access-x25p2\") pod \"certified-operators-lt495\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:47 crc kubenswrapper[4735]: I0317 04:23:47.440166 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:48 crc kubenswrapper[4735]: I0317 04:23:48.474220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt495"] Mar 17 04:23:49 crc kubenswrapper[4735]: I0317 04:23:49.271135 4735 generic.go:334] "Generic (PLEG): container finished" podID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerID="c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b" exitCode=0 Mar 17 04:23:49 crc kubenswrapper[4735]: I0317 04:23:49.271331 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerDied","Data":"c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b"} Mar 17 04:23:49 crc kubenswrapper[4735]: I0317 04:23:49.271373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerStarted","Data":"9adfdcaa515d6ffbfecb7bf67561c837a6fcdbb73ab244388c8403220fa642c6"} Mar 17 04:23:50 crc kubenswrapper[4735]: I0317 04:23:50.280844 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerStarted","Data":"a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4"} Mar 17 04:23:51 crc kubenswrapper[4735]: I0317 04:23:51.074280 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:23:51 crc kubenswrapper[4735]: E0317 04:23:51.074769 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:23:52 crc kubenswrapper[4735]: I0317 04:23:52.308610 4735 generic.go:334] "Generic (PLEG): container finished" podID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerID="a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4" exitCode=0 Mar 17 04:23:52 crc kubenswrapper[4735]: I0317 04:23:52.308675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerDied","Data":"a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4"} Mar 17 04:23:53 crc kubenswrapper[4735]: I0317 04:23:53.321046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerStarted","Data":"6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70"} Mar 17 04:23:53 crc kubenswrapper[4735]: I0317 04:23:53.344411 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lt495" podStartSLOduration=2.682917422 podStartE2EDuration="6.344395233s" podCreationTimestamp="2026-03-17 04:23:47 +0000 UTC" firstStartedPulling="2026-03-17 04:23:49.272750719 +0000 UTC m=+11654.904983697" lastFinishedPulling="2026-03-17 04:23:52.93422853 +0000 UTC m=+11658.566461508" observedRunningTime="2026-03-17 04:23:53.336057971 +0000 UTC m=+11658.968290939" watchObservedRunningTime="2026-03-17 04:23:53.344395233 +0000 UTC m=+11658.976628211" Mar 17 04:23:57 crc kubenswrapper[4735]: I0317 04:23:57.442692 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:57 crc kubenswrapper[4735]: I0317 
04:23:57.443111 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:23:58 crc kubenswrapper[4735]: I0317 04:23:58.522585 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lt495" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="registry-server" probeResult="failure" output=< Mar 17 04:23:58 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:23:58 crc kubenswrapper[4735]: > Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.170735 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562024-tt9xn"] Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.172962 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.175241 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.175469 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.175601 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.190291 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562024-tt9xn"] Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.373655 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dvd8\" (UniqueName: \"kubernetes.io/projected/d6f01206-a188-41e6-87ec-3f7e3966c641-kube-api-access-2dvd8\") pod \"auto-csr-approver-29562024-tt9xn\" (UID: 
\"d6f01206-a188-41e6-87ec-3f7e3966c641\") " pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.475128 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dvd8\" (UniqueName: \"kubernetes.io/projected/d6f01206-a188-41e6-87ec-3f7e3966c641-kube-api-access-2dvd8\") pod \"auto-csr-approver-29562024-tt9xn\" (UID: \"d6f01206-a188-41e6-87ec-3f7e3966c641\") " pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.512357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dvd8\" (UniqueName: \"kubernetes.io/projected/d6f01206-a188-41e6-87ec-3f7e3966c641-kube-api-access-2dvd8\") pod \"auto-csr-approver-29562024-tt9xn\" (UID: \"d6f01206-a188-41e6-87ec-3f7e3966c641\") " pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:00 crc kubenswrapper[4735]: I0317 04:24:00.791373 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:01 crc kubenswrapper[4735]: I0317 04:24:01.368334 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562024-tt9xn"] Mar 17 04:24:01 crc kubenswrapper[4735]: I0317 04:24:01.400737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" event={"ID":"d6f01206-a188-41e6-87ec-3f7e3966c641","Type":"ContainerStarted","Data":"5ad0e6efba584add21d7a60eb09e8acf8e9de07af54c8ee3a2500510169ffc10"} Mar 17 04:24:03 crc kubenswrapper[4735]: I0317 04:24:03.423491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" event={"ID":"d6f01206-a188-41e6-87ec-3f7e3966c641","Type":"ContainerStarted","Data":"0463ad9c97229910e154383c179d91d5cefdb04eef267f441a39b3882ebc8343"} Mar 17 04:24:03 crc kubenswrapper[4735]: I0317 04:24:03.442958 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" podStartSLOduration=2.607125209 podStartE2EDuration="3.442942911s" podCreationTimestamp="2026-03-17 04:24:00 +0000 UTC" firstStartedPulling="2026-03-17 04:24:01.376638829 +0000 UTC m=+11667.008871807" lastFinishedPulling="2026-03-17 04:24:02.212456541 +0000 UTC m=+11667.844689509" observedRunningTime="2026-03-17 04:24:03.43754794 +0000 UTC m=+11669.069780918" watchObservedRunningTime="2026-03-17 04:24:03.442942911 +0000 UTC m=+11669.075175889" Mar 17 04:24:04 crc kubenswrapper[4735]: I0317 04:24:04.439185 4735 generic.go:334] "Generic (PLEG): container finished" podID="d6f01206-a188-41e6-87ec-3f7e3966c641" containerID="0463ad9c97229910e154383c179d91d5cefdb04eef267f441a39b3882ebc8343" exitCode=0 Mar 17 04:24:04 crc kubenswrapper[4735]: I0317 04:24:04.439468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" 
event={"ID":"d6f01206-a188-41e6-87ec-3f7e3966c641","Type":"ContainerDied","Data":"0463ad9c97229910e154383c179d91d5cefdb04eef267f441a39b3882ebc8343"} Mar 17 04:24:05 crc kubenswrapper[4735]: I0317 04:24:05.080646 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:24:05 crc kubenswrapper[4735]: E0317 04:24:05.081052 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:24:05 crc kubenswrapper[4735]: I0317 04:24:05.801940 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:05 crc kubenswrapper[4735]: I0317 04:24:05.999010 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dvd8\" (UniqueName: \"kubernetes.io/projected/d6f01206-a188-41e6-87ec-3f7e3966c641-kube-api-access-2dvd8\") pod \"d6f01206-a188-41e6-87ec-3f7e3966c641\" (UID: \"d6f01206-a188-41e6-87ec-3f7e3966c641\") " Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.013354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f01206-a188-41e6-87ec-3f7e3966c641-kube-api-access-2dvd8" (OuterVolumeSpecName: "kube-api-access-2dvd8") pod "d6f01206-a188-41e6-87ec-3f7e3966c641" (UID: "d6f01206-a188-41e6-87ec-3f7e3966c641"). InnerVolumeSpecName "kube-api-access-2dvd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.101505 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dvd8\" (UniqueName: \"kubernetes.io/projected/d6f01206-a188-41e6-87ec-3f7e3966c641-kube-api-access-2dvd8\") on node \"crc\" DevicePath \"\"" Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.454046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" event={"ID":"d6f01206-a188-41e6-87ec-3f7e3966c641","Type":"ContainerDied","Data":"5ad0e6efba584add21d7a60eb09e8acf8e9de07af54c8ee3a2500510169ffc10"} Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.454416 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad0e6efba584add21d7a60eb09e8acf8e9de07af54c8ee3a2500510169ffc10" Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.454289 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562024-tt9xn" Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.523398 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562018-2qkzs"] Mar 17 04:24:06 crc kubenswrapper[4735]: I0317 04:24:06.533201 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562018-2qkzs"] Mar 17 04:24:07 crc kubenswrapper[4735]: I0317 04:24:07.085503 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168cd015-df53-47df-9187-fcdba9e11bc8" path="/var/lib/kubelet/pods/168cd015-df53-47df-9187-fcdba9e11bc8/volumes" Mar 17 04:24:07 crc kubenswrapper[4735]: I0317 04:24:07.500031 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:24:07 crc kubenswrapper[4735]: I0317 04:24:07.554539 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:24:07 crc kubenswrapper[4735]: I0317 04:24:07.737798 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt495"] Mar 17 04:24:09 crc kubenswrapper[4735]: I0317 04:24:09.492167 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lt495" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="registry-server" containerID="cri-o://6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70" gracePeriod=2 Mar 17 04:24:09 crc kubenswrapper[4735]: I0317 04:24:09.987388 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.071524 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-utilities\") pod \"56f7d486-3a40-49d5-8de1-eced7be397fe\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.071638 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-catalog-content\") pod \"56f7d486-3a40-49d5-8de1-eced7be397fe\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.071794 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x25p2\" (UniqueName: \"kubernetes.io/projected/56f7d486-3a40-49d5-8de1-eced7be397fe-kube-api-access-x25p2\") pod \"56f7d486-3a40-49d5-8de1-eced7be397fe\" (UID: \"56f7d486-3a40-49d5-8de1-eced7be397fe\") " Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.076621 4735 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-utilities" (OuterVolumeSpecName: "utilities") pod "56f7d486-3a40-49d5-8de1-eced7be397fe" (UID: "56f7d486-3a40-49d5-8de1-eced7be397fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.159014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f7d486-3a40-49d5-8de1-eced7be397fe-kube-api-access-x25p2" (OuterVolumeSpecName: "kube-api-access-x25p2") pod "56f7d486-3a40-49d5-8de1-eced7be397fe" (UID: "56f7d486-3a40-49d5-8de1-eced7be397fe"). InnerVolumeSpecName "kube-api-access-x25p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.183774 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.183798 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x25p2\" (UniqueName: \"kubernetes.io/projected/56f7d486-3a40-49d5-8de1-eced7be397fe-kube-api-access-x25p2\") on node \"crc\" DevicePath \"\"" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.216049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56f7d486-3a40-49d5-8de1-eced7be397fe" (UID: "56f7d486-3a40-49d5-8de1-eced7be397fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.286108 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f7d486-3a40-49d5-8de1-eced7be397fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.499321 4735 generic.go:334] "Generic (PLEG): container finished" podID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerID="6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70" exitCode=0 Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.499368 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerDied","Data":"6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70"} Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.499444 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lt495" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.500068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt495" event={"ID":"56f7d486-3a40-49d5-8de1-eced7be397fe","Type":"ContainerDied","Data":"9adfdcaa515d6ffbfecb7bf67561c837a6fcdbb73ab244388c8403220fa642c6"} Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.500110 4735 scope.go:117] "RemoveContainer" containerID="6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.526858 4735 scope.go:117] "RemoveContainer" containerID="a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.545939 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt495"] Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.556999 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lt495"] Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.569904 4735 scope.go:117] "RemoveContainer" containerID="c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.604476 4735 scope.go:117] "RemoveContainer" containerID="6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70" Mar 17 04:24:10 crc kubenswrapper[4735]: E0317 04:24:10.609277 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70\": container with ID starting with 6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70 not found: ID does not exist" containerID="6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.609322 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70"} err="failed to get container status \"6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70\": rpc error: code = NotFound desc = could not find container \"6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70\": container with ID starting with 6e5df5d6657d9b0f1ea8bdb92db51a3294cb498c355ceea165f63b85be572b70 not found: ID does not exist" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.609347 4735 scope.go:117] "RemoveContainer" containerID="a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4" Mar 17 04:24:10 crc kubenswrapper[4735]: E0317 04:24:10.609803 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4\": container with ID starting with a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4 not found: ID does not exist" containerID="a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.609828 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4"} err="failed to get container status \"a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4\": rpc error: code = NotFound desc = could not find container \"a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4\": container with ID starting with a30c3dd9b8dfd849f5e270ef2fb6e897478398f85878886496b0dc69dfd22fb4 not found: ID does not exist" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.609849 4735 scope.go:117] "RemoveContainer" containerID="c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b" Mar 17 04:24:10 crc kubenswrapper[4735]: E0317 
04:24:10.610197 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b\": container with ID starting with c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b not found: ID does not exist" containerID="c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b" Mar 17 04:24:10 crc kubenswrapper[4735]: I0317 04:24:10.610222 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b"} err="failed to get container status \"c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b\": rpc error: code = NotFound desc = could not find container \"c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b\": container with ID starting with c672cced306bb6e24bbbcf13100811b0db88d9c4bb7664ef45e8648b9ecd494b not found: ID does not exist" Mar 17 04:24:11 crc kubenswrapper[4735]: I0317 04:24:11.084538 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" path="/var/lib/kubelet/pods/56f7d486-3a40-49d5-8de1-eced7be397fe/volumes" Mar 17 04:24:12 crc kubenswrapper[4735]: I0317 04:24:12.668940 4735 scope.go:117] "RemoveContainer" containerID="be6cf30276bf5d376434e74ef18e02d9de7039443fcf8a6cb813d768f62f7a67" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.073306 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:24:18 crc kubenswrapper[4735]: E0317 04:24:18.073997 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.301689 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-788lk_1edf1039-ebea-4804-9f30-6844633b7919/kube-rbac-proxy/0.log" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.457957 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-788lk_1edf1039-ebea-4804-9f30-6844633b7919/controller/0.log" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.609528 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.904373 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.916125 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.941075 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:24:18 crc kubenswrapper[4735]: I0317 04:24:18.984417 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.148521 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.186217 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.274238 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.289333 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.398410 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.408198 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.454392 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.510278 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/controller/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.601620 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/frr-metrics/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.727960 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/kube-rbac-proxy-frr/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.763067 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/kube-rbac-proxy/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.925132 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/reloader/0.log" Mar 17 04:24:19 crc kubenswrapper[4735]: I0317 04:24:19.988241 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-c667b_665a221c-0d9f-4dfd-888a-fc7d5f09fbdb/frr-k8s-webhook-server/0.log" Mar 17 04:24:20 crc kubenswrapper[4735]: I0317 04:24:20.337321 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5bffff7ccd-ss6s5_81e7afb5-02be-49b0-bd12-39b2b2346a93/manager/0.log" Mar 17 04:24:20 crc kubenswrapper[4735]: I0317 04:24:20.462205 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b5d585c89-bjzst_1841e816-4298-4c01-8bfa-07273ea8dfff/webhook-server/0.log" Mar 17 04:24:20 crc kubenswrapper[4735]: I0317 04:24:20.747968 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-slwx6_3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e/kube-rbac-proxy/0.log" Mar 17 04:24:21 crc kubenswrapper[4735]: I0317 04:24:21.348370 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-slwx6_3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e/speaker/0.log" Mar 17 04:24:22 crc kubenswrapper[4735]: I0317 04:24:22.572306 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/frr/0.log" Mar 17 04:24:30 crc kubenswrapper[4735]: I0317 04:24:30.072864 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:24:30 crc kubenswrapper[4735]: E0317 04:24:30.073864 4735 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.277137 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/util/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.475910 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/util/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.512423 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/pull/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.529417 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/pull/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.700674 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/util/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.735905 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/pull/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 
04:24:35.745126 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/extract/0.log" Mar 17 04:24:35 crc kubenswrapper[4735]: I0317 04:24:35.909315 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/util/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.066470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/util/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.093600 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/pull/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.112713 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/pull/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.314428 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/extract/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.314477 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/pull/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.327783 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/util/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.502319 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-utilities/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.649372 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-content/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.670991 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-utilities/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.698581 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-content/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.939705 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-content/0.log" Mar 17 04:24:36 crc kubenswrapper[4735]: I0317 04:24:36.987850 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-utilities/0.log" Mar 17 04:24:37 crc kubenswrapper[4735]: I0317 04:24:37.504356 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-utilities/0.log" Mar 17 04:24:37 crc kubenswrapper[4735]: I0317 04:24:37.766591 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-utilities/0.log" Mar 17 04:24:37 crc kubenswrapper[4735]: I0317 04:24:37.788315 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-content/0.log" Mar 17 04:24:37 crc kubenswrapper[4735]: I0317 04:24:37.802600 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-content/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.023645 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/registry-server/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.090250 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-content/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.147974 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-utilities/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.413850 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/registry-server/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.420065 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2qkds_f56c71b2-cf3c-4fe1-8c13-fd905c5a623d/marketplace-operator/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.459364 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jgckb"] Mar 17 04:24:38 crc 
kubenswrapper[4735]: E0317 04:24:38.459837 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f01206-a188-41e6-87ec-3f7e3966c641" containerName="oc" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.459850 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f01206-a188-41e6-87ec-3f7e3966c641" containerName="oc" Mar 17 04:24:38 crc kubenswrapper[4735]: E0317 04:24:38.459876 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="extract-utilities" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.459883 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="extract-utilities" Mar 17 04:24:38 crc kubenswrapper[4735]: E0317 04:24:38.459901 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="registry-server" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.459908 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="registry-server" Mar 17 04:24:38 crc kubenswrapper[4735]: E0317 04:24:38.459921 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="extract-content" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.459926 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="extract-content" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.460112 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f7d486-3a40-49d5-8de1-eced7be397fe" containerName="registry-server" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.460125 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f01206-a188-41e6-87ec-3f7e3966c641" containerName="oc" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 
04:24:38.461490 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.471325 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jgckb"] Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.579931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rx2\" (UniqueName: \"kubernetes.io/projected/e59de944-16ae-4f19-bdf5-8ce797621ec5-kube-api-access-k7rx2\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.580017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-utilities\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.580039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-catalog-content\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.678044 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-utilities/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.681156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rx2\" (UniqueName: 
\"kubernetes.io/projected/e59de944-16ae-4f19-bdf5-8ce797621ec5-kube-api-access-k7rx2\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.681241 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-utilities\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.681266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-catalog-content\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.681737 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-catalog-content\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.681971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-utilities\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.702303 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rx2\" (UniqueName: 
\"kubernetes.io/projected/e59de944-16ae-4f19-bdf5-8ce797621ec5-kube-api-access-k7rx2\") pod \"redhat-operators-jgckb\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.787369 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.826516 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-utilities/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.877121 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-content/0.log" Mar 17 04:24:38 crc kubenswrapper[4735]: I0317 04:24:38.924914 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-content/0.log" Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.190608 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-utilities/0.log" Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.321260 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jgckb"] Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.347016 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-content/0.log" Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.476065 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-utilities/0.log" Mar 17 
04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.626519 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/registry-server/0.log" Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.755674 4735 generic.go:334] "Generic (PLEG): container finished" podID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerID="a39dafe5970580673865e7cfa6a432b42485ca174dc9558ef55fe33438137c37" exitCode=0 Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.755719 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgckb" event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerDied","Data":"a39dafe5970580673865e7cfa6a432b42485ca174dc9558ef55fe33438137c37"} Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.755752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgckb" event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerStarted","Data":"d60d621ba2ebea2d29b4570cd8fd50b35dba3c00cd57e93ff93d569df44556b0"} Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.773010 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-content/0.log" Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.818833 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-utilities/0.log" Mar 17 04:24:39 crc kubenswrapper[4735]: I0317 04:24:39.841116 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-content/0.log" Mar 17 04:24:40 crc kubenswrapper[4735]: I0317 04:24:40.022788 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-utilities/0.log" Mar 17 04:24:40 crc kubenswrapper[4735]: I0317 04:24:40.060281 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-content/0.log" Mar 17 04:24:40 crc kubenswrapper[4735]: I0317 04:24:40.765714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgckb" event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerStarted","Data":"98a6d0a042358207dbcafdc6ceb9d1a05414d39d7811d8fb27ab0019be352b73"} Mar 17 04:24:41 crc kubenswrapper[4735]: I0317 04:24:41.182175 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/registry-server/0.log" Mar 17 04:24:44 crc kubenswrapper[4735]: I0317 04:24:44.073404 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:24:44 crc kubenswrapper[4735]: E0317 04:24:44.074094 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:24:46 crc kubenswrapper[4735]: I0317 04:24:46.830943 4735 generic.go:334] "Generic (PLEG): container finished" podID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerID="98a6d0a042358207dbcafdc6ceb9d1a05414d39d7811d8fb27ab0019be352b73" exitCode=0 Mar 17 04:24:46 crc kubenswrapper[4735]: I0317 04:24:46.831154 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jgckb" event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerDied","Data":"98a6d0a042358207dbcafdc6ceb9d1a05414d39d7811d8fb27ab0019be352b73"} Mar 17 04:24:47 crc kubenswrapper[4735]: I0317 04:24:47.843229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgckb" event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerStarted","Data":"70a656cbf506c413f9de92ed65c5e7f0b60789d4903a82ea4b02b5ccc6f158d6"} Mar 17 04:24:47 crc kubenswrapper[4735]: I0317 04:24:47.860784 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jgckb" podStartSLOduration=2.29486596 podStartE2EDuration="9.860766115s" podCreationTimestamp="2026-03-17 04:24:38 +0000 UTC" firstStartedPulling="2026-03-17 04:24:39.757291522 +0000 UTC m=+11705.389524500" lastFinishedPulling="2026-03-17 04:24:47.323191667 +0000 UTC m=+11712.955424655" observedRunningTime="2026-03-17 04:24:47.859847593 +0000 UTC m=+11713.492080571" watchObservedRunningTime="2026-03-17 04:24:47.860766115 +0000 UTC m=+11713.492999113" Mar 17 04:24:48 crc kubenswrapper[4735]: I0317 04:24:48.789098 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:48 crc kubenswrapper[4735]: I0317 04:24:48.789372 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:24:49 crc kubenswrapper[4735]: I0317 04:24:49.833821 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jgckb" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" probeResult="failure" output=< Mar 17 04:24:49 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:24:49 crc kubenswrapper[4735]: > Mar 17 04:24:59 crc kubenswrapper[4735]: I0317 
04:24:59.073116 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:24:59 crc kubenswrapper[4735]: E0317 04:24:59.073748 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:24:59 crc kubenswrapper[4735]: I0317 04:24:59.882155 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jgckb" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" probeResult="failure" output=< Mar 17 04:24:59 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:24:59 crc kubenswrapper[4735]: > Mar 17 04:25:09 crc kubenswrapper[4735]: I0317 04:25:09.857970 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jgckb" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" probeResult="failure" output=< Mar 17 04:25:09 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:25:09 crc kubenswrapper[4735]: > Mar 17 04:25:13 crc kubenswrapper[4735]: I0317 04:25:13.074205 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:25:14 crc kubenswrapper[4735]: I0317 04:25:14.079101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"9244f6b071a8f0899af1e5371b673d4f6c56665e99d03dbd6f129f288c94bfa9"} Mar 17 04:25:19 crc 
kubenswrapper[4735]: I0317 04:25:19.835258 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jgckb" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" probeResult="failure" output=< Mar 17 04:25:19 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:25:19 crc kubenswrapper[4735]: > Mar 17 04:25:29 crc kubenswrapper[4735]: I0317 04:25:29.843495 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jgckb" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" probeResult="failure" output=< Mar 17 04:25:29 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:25:29 crc kubenswrapper[4735]: > Mar 17 04:25:38 crc kubenswrapper[4735]: I0317 04:25:38.890996 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:25:38 crc kubenswrapper[4735]: I0317 04:25:38.970652 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:25:39 crc kubenswrapper[4735]: I0317 04:25:39.736555 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jgckb"] Mar 17 04:25:40 crc kubenswrapper[4735]: I0317 04:25:40.330446 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jgckb" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" containerID="cri-o://70a656cbf506c413f9de92ed65c5e7f0b60789d4903a82ea4b02b5ccc6f158d6" gracePeriod=2 Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.361094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgckb" 
event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerDied","Data":"70a656cbf506c413f9de92ed65c5e7f0b60789d4903a82ea4b02b5ccc6f158d6"} Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.361108 4735 generic.go:334] "Generic (PLEG): container finished" podID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerID="70a656cbf506c413f9de92ed65c5e7f0b60789d4903a82ea4b02b5ccc6f158d6" exitCode=0 Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.549818 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.728505 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rx2\" (UniqueName: \"kubernetes.io/projected/e59de944-16ae-4f19-bdf5-8ce797621ec5-kube-api-access-k7rx2\") pod \"e59de944-16ae-4f19-bdf5-8ce797621ec5\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.728731 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-utilities\") pod \"e59de944-16ae-4f19-bdf5-8ce797621ec5\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.728810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-catalog-content\") pod \"e59de944-16ae-4f19-bdf5-8ce797621ec5\" (UID: \"e59de944-16ae-4f19-bdf5-8ce797621ec5\") " Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.730150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-utilities" (OuterVolumeSpecName: "utilities") pod "e59de944-16ae-4f19-bdf5-8ce797621ec5" (UID: 
"e59de944-16ae-4f19-bdf5-8ce797621ec5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.736384 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59de944-16ae-4f19-bdf5-8ce797621ec5-kube-api-access-k7rx2" (OuterVolumeSpecName: "kube-api-access-k7rx2") pod "e59de944-16ae-4f19-bdf5-8ce797621ec5" (UID: "e59de944-16ae-4f19-bdf5-8ce797621ec5"). InnerVolumeSpecName "kube-api-access-k7rx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.831249 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rx2\" (UniqueName: \"kubernetes.io/projected/e59de944-16ae-4f19-bdf5-8ce797621ec5-kube-api-access-k7rx2\") on node \"crc\" DevicePath \"\"" Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.831282 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.923738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e59de944-16ae-4f19-bdf5-8ce797621ec5" (UID: "e59de944-16ae-4f19-bdf5-8ce797621ec5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:25:41 crc kubenswrapper[4735]: I0317 04:25:41.933268 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59de944-16ae-4f19-bdf5-8ce797621ec5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.379573 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgckb" event={"ID":"e59de944-16ae-4f19-bdf5-8ce797621ec5","Type":"ContainerDied","Data":"d60d621ba2ebea2d29b4570cd8fd50b35dba3c00cd57e93ff93d569df44556b0"} Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.379629 4735 scope.go:117] "RemoveContainer" containerID="70a656cbf506c413f9de92ed65c5e7f0b60789d4903a82ea4b02b5ccc6f158d6" Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.379664 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgckb" Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.436978 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jgckb"] Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.444151 4735 scope.go:117] "RemoveContainer" containerID="98a6d0a042358207dbcafdc6ceb9d1a05414d39d7811d8fb27ab0019be352b73" Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.446091 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jgckb"] Mar 17 04:25:42 crc kubenswrapper[4735]: I0317 04:25:42.469795 4735 scope.go:117] "RemoveContainer" containerID="a39dafe5970580673865e7cfa6a432b42485ca174dc9558ef55fe33438137c37" Mar 17 04:25:43 crc kubenswrapper[4735]: I0317 04:25:43.091232 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" path="/var/lib/kubelet/pods/e59de944-16ae-4f19-bdf5-8ce797621ec5/volumes" Mar 17 04:26:00 crc 
kubenswrapper[4735]: I0317 04:26:00.184388 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562026-vlzvh"] Mar 17 04:26:00 crc kubenswrapper[4735]: E0317 04:26:00.188661 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="extract-content" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.188821 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="extract-content" Mar 17 04:26:00 crc kubenswrapper[4735]: E0317 04:26:00.188951 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="extract-utilities" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.189030 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="extract-utilities" Mar 17 04:26:00 crc kubenswrapper[4735]: E0317 04:26:00.189122 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.189196 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.189692 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59de944-16ae-4f19-bdf5-8ce797621ec5" containerName="registry-server" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.193376 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.208350 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562026-vlzvh"] Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.214154 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.214803 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.214827 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.302016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvgx5\" (UniqueName: \"kubernetes.io/projected/72300078-c045-4609-a04b-980cf6e17df9-kube-api-access-fvgx5\") pod \"auto-csr-approver-29562026-vlzvh\" (UID: \"72300078-c045-4609-a04b-980cf6e17df9\") " pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.404413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvgx5\" (UniqueName: \"kubernetes.io/projected/72300078-c045-4609-a04b-980cf6e17df9-kube-api-access-fvgx5\") pod \"auto-csr-approver-29562026-vlzvh\" (UID: \"72300078-c045-4609-a04b-980cf6e17df9\") " pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.430594 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvgx5\" (UniqueName: \"kubernetes.io/projected/72300078-c045-4609-a04b-980cf6e17df9-kube-api-access-fvgx5\") pod \"auto-csr-approver-29562026-vlzvh\" (UID: \"72300078-c045-4609-a04b-980cf6e17df9\") " 
pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:00 crc kubenswrapper[4735]: I0317 04:26:00.517909 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:01 crc kubenswrapper[4735]: I0317 04:26:01.087677 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:26:01 crc kubenswrapper[4735]: I0317 04:26:01.097440 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562026-vlzvh"] Mar 17 04:26:01 crc kubenswrapper[4735]: I0317 04:26:01.614064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" event={"ID":"72300078-c045-4609-a04b-980cf6e17df9","Type":"ContainerStarted","Data":"28bd8ba5e71da5b648cf34d548ae72e1f70b0f571cbe33031266442514eced7e"} Mar 17 04:26:03 crc kubenswrapper[4735]: I0317 04:26:03.635112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" event={"ID":"72300078-c045-4609-a04b-980cf6e17df9","Type":"ContainerStarted","Data":"2388307c3fe38c2a8b4f01dcbbefd2beccb29ec0562ede7ace196cd5bc9b06c7"} Mar 17 04:26:03 crc kubenswrapper[4735]: I0317 04:26:03.671516 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" podStartSLOduration=2.837549591 podStartE2EDuration="3.671405667s" podCreationTimestamp="2026-03-17 04:26:00 +0000 UTC" firstStartedPulling="2026-03-17 04:26:01.083206923 +0000 UTC m=+11786.715439911" lastFinishedPulling="2026-03-17 04:26:01.917063009 +0000 UTC m=+11787.549295987" observedRunningTime="2026-03-17 04:26:03.658111146 +0000 UTC m=+11789.290344164" watchObservedRunningTime="2026-03-17 04:26:03.671405667 +0000 UTC m=+11789.303638645" Mar 17 04:26:04 crc kubenswrapper[4735]: I0317 04:26:04.648892 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="72300078-c045-4609-a04b-980cf6e17df9" containerID="2388307c3fe38c2a8b4f01dcbbefd2beccb29ec0562ede7ace196cd5bc9b06c7" exitCode=0 Mar 17 04:26:04 crc kubenswrapper[4735]: I0317 04:26:04.648942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" event={"ID":"72300078-c045-4609-a04b-980cf6e17df9","Type":"ContainerDied","Data":"2388307c3fe38c2a8b4f01dcbbefd2beccb29ec0562ede7ace196cd5bc9b06c7"} Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.047405 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.146130 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvgx5\" (UniqueName: \"kubernetes.io/projected/72300078-c045-4609-a04b-980cf6e17df9-kube-api-access-fvgx5\") pod \"72300078-c045-4609-a04b-980cf6e17df9\" (UID: \"72300078-c045-4609-a04b-980cf6e17df9\") " Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.157083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72300078-c045-4609-a04b-980cf6e17df9-kube-api-access-fvgx5" (OuterVolumeSpecName: "kube-api-access-fvgx5") pod "72300078-c045-4609-a04b-980cf6e17df9" (UID: "72300078-c045-4609-a04b-980cf6e17df9"). InnerVolumeSpecName "kube-api-access-fvgx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.248781 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvgx5\" (UniqueName: \"kubernetes.io/projected/72300078-c045-4609-a04b-980cf6e17df9-kube-api-access-fvgx5\") on node \"crc\" DevicePath \"\"" Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.672179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" event={"ID":"72300078-c045-4609-a04b-980cf6e17df9","Type":"ContainerDied","Data":"28bd8ba5e71da5b648cf34d548ae72e1f70b0f571cbe33031266442514eced7e"} Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.672699 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28bd8ba5e71da5b648cf34d548ae72e1f70b0f571cbe33031266442514eced7e" Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.672260 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562026-vlzvh" Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.772386 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562020-k5rlj"] Mar 17 04:26:06 crc kubenswrapper[4735]: I0317 04:26:06.785733 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562020-k5rlj"] Mar 17 04:26:07 crc kubenswrapper[4735]: I0317 04:26:07.082874 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73722a66-1ec7-4810-9026-94373db1e929" path="/var/lib/kubelet/pods/73722a66-1ec7-4810-9026-94373db1e929/volumes" Mar 17 04:26:12 crc kubenswrapper[4735]: I0317 04:26:12.843527 4735 scope.go:117] "RemoveContainer" containerID="5a65265b3993eb32238489e4075b3293e3f5cd370291127819b098c83f309fc6" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.096429 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-v92jx"] Mar 17 04:26:37 crc kubenswrapper[4735]: E0317 04:26:37.097465 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72300078-c045-4609-a04b-980cf6e17df9" containerName="oc" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.097479 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="72300078-c045-4609-a04b-980cf6e17df9" containerName="oc" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.097660 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="72300078-c045-4609-a04b-980cf6e17df9" containerName="oc" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.100307 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.115323 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v92jx"] Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.240707 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-catalog-content\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.241158 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-utilities\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.241502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt6dp\" 
(UniqueName: \"kubernetes.io/projected/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-kube-api-access-bt6dp\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.343433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-utilities\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.343558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt6dp\" (UniqueName: \"kubernetes.io/projected/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-kube-api-access-bt6dp\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.343635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-catalog-content\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.343907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-utilities\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.344105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-catalog-content\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.381527 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt6dp\" (UniqueName: \"kubernetes.io/projected/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-kube-api-access-bt6dp\") pod \"community-operators-v92jx\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.418527 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:37 crc kubenswrapper[4735]: I0317 04:26:37.941035 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v92jx"] Mar 17 04:26:38 crc kubenswrapper[4735]: I0317 04:26:38.010843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerStarted","Data":"0dc8bf9fef6c5ff538b407787ed511069649077af5ded16b59547654e229f328"} Mar 17 04:26:39 crc kubenswrapper[4735]: I0317 04:26:39.040643 4735 generic.go:334] "Generic (PLEG): container finished" podID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerID="d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581" exitCode=0 Mar 17 04:26:39 crc kubenswrapper[4735]: I0317 04:26:39.040830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerDied","Data":"d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581"} Mar 17 04:26:40 crc kubenswrapper[4735]: I0317 04:26:40.055033 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerStarted","Data":"fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304"} Mar 17 04:26:42 crc kubenswrapper[4735]: I0317 04:26:42.075597 4735 generic.go:334] "Generic (PLEG): container finished" podID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerID="fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304" exitCode=0 Mar 17 04:26:42 crc kubenswrapper[4735]: I0317 04:26:42.075672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerDied","Data":"fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304"} Mar 17 04:26:43 crc kubenswrapper[4735]: I0317 04:26:43.123987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerStarted","Data":"240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e"} Mar 17 04:26:43 crc kubenswrapper[4735]: I0317 04:26:43.161624 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v92jx" podStartSLOduration=2.722645467 podStartE2EDuration="6.161604783s" podCreationTimestamp="2026-03-17 04:26:37 +0000 UTC" firstStartedPulling="2026-03-17 04:26:39.046231984 +0000 UTC m=+11824.678464962" lastFinishedPulling="2026-03-17 04:26:42.4851913 +0000 UTC m=+11828.117424278" observedRunningTime="2026-03-17 04:26:43.152329589 +0000 UTC m=+11828.784562587" watchObservedRunningTime="2026-03-17 04:26:43.161604783 +0000 UTC m=+11828.793837751" Mar 17 04:26:47 crc kubenswrapper[4735]: I0317 04:26:47.419076 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:47 crc kubenswrapper[4735]: I0317 
04:26:47.419660 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:48 crc kubenswrapper[4735]: I0317 04:26:48.473820 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-v92jx" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="registry-server" probeResult="failure" output=< Mar 17 04:26:48 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:26:48 crc kubenswrapper[4735]: > Mar 17 04:26:57 crc kubenswrapper[4735]: I0317 04:26:57.487734 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:57 crc kubenswrapper[4735]: I0317 04:26:57.554931 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:57 crc kubenswrapper[4735]: I0317 04:26:57.724667 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v92jx"] Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.270935 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v92jx" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="registry-server" containerID="cri-o://240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e" gracePeriod=2 Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.823541 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.929079 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-catalog-content\") pod \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.929411 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-utilities\") pod \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.929614 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt6dp\" (UniqueName: \"kubernetes.io/projected/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-kube-api-access-bt6dp\") pod \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\" (UID: \"bf79a8e7-8f78-426e-bbe0-ef16d20dc342\") " Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.930176 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-utilities" (OuterVolumeSpecName: "utilities") pod "bf79a8e7-8f78-426e-bbe0-ef16d20dc342" (UID: "bf79a8e7-8f78-426e-bbe0-ef16d20dc342"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:26:59 crc kubenswrapper[4735]: I0317 04:26:59.944002 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-kube-api-access-bt6dp" (OuterVolumeSpecName: "kube-api-access-bt6dp") pod "bf79a8e7-8f78-426e-bbe0-ef16d20dc342" (UID: "bf79a8e7-8f78-426e-bbe0-ef16d20dc342"). InnerVolumeSpecName "kube-api-access-bt6dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.001510 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf79a8e7-8f78-426e-bbe0-ef16d20dc342" (UID: "bf79a8e7-8f78-426e-bbe0-ef16d20dc342"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.031768 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.032030 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.032095 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt6dp\" (UniqueName: \"kubernetes.io/projected/bf79a8e7-8f78-426e-bbe0-ef16d20dc342-kube-api-access-bt6dp\") on node \"crc\" DevicePath \"\"" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.288284 4735 generic.go:334] "Generic (PLEG): container finished" podID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerID="240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e" exitCode=0 Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.288372 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v92jx" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.288410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerDied","Data":"240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e"} Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.289490 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92jx" event={"ID":"bf79a8e7-8f78-426e-bbe0-ef16d20dc342","Type":"ContainerDied","Data":"0dc8bf9fef6c5ff538b407787ed511069649077af5ded16b59547654e229f328"} Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.289568 4735 scope.go:117] "RemoveContainer" containerID="240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.330485 4735 scope.go:117] "RemoveContainer" containerID="fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.336060 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v92jx"] Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.343005 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v92jx"] Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.354593 4735 scope.go:117] "RemoveContainer" containerID="d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.412559 4735 scope.go:117] "RemoveContainer" containerID="240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e" Mar 17 04:27:00 crc kubenswrapper[4735]: E0317 04:27:00.416559 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e\": container with ID starting with 240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e not found: ID does not exist" containerID="240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.416610 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e"} err="failed to get container status \"240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e\": rpc error: code = NotFound desc = could not find container \"240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e\": container with ID starting with 240ea5e82095585a34a5bdada7312cc9f067b293f777a0a7ba3b0f9ecd8b572e not found: ID does not exist" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.416639 4735 scope.go:117] "RemoveContainer" containerID="fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304" Mar 17 04:27:00 crc kubenswrapper[4735]: E0317 04:27:00.417092 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304\": container with ID starting with fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304 not found: ID does not exist" containerID="fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.417120 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304"} err="failed to get container status \"fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304\": rpc error: code = NotFound desc = could not find container \"fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304\": container with ID 
starting with fac3d9d1c0fde0683371de496299183f4ae06a137c454b397e93a4510929f304 not found: ID does not exist" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.417141 4735 scope.go:117] "RemoveContainer" containerID="d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581" Mar 17 04:27:00 crc kubenswrapper[4735]: E0317 04:27:00.417459 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581\": container with ID starting with d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581 not found: ID does not exist" containerID="d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581" Mar 17 04:27:00 crc kubenswrapper[4735]: I0317 04:27:00.417475 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581"} err="failed to get container status \"d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581\": rpc error: code = NotFound desc = could not find container \"d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581\": container with ID starting with d39872a440230812f38bd558d837076f55a9688a851f6c41440f2c26ebe97581 not found: ID does not exist" Mar 17 04:27:01 crc kubenswrapper[4735]: I0317 04:27:01.095833 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" path="/var/lib/kubelet/pods/bf79a8e7-8f78-426e-bbe0-ef16d20dc342/volumes" Mar 17 04:27:13 crc kubenswrapper[4735]: I0317 04:27:13.063378 4735 scope.go:117] "RemoveContainer" containerID="d382c72f7ce3765cc4af5c233dd8ae35b6e204e3e475d39c9efd591883b6a73f" Mar 17 04:27:23 crc kubenswrapper[4735]: I0317 04:27:23.575088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfgbl/must-gather-586s4" 
event={"ID":"83f7fc6c-fba5-429d-a031-19607709268d","Type":"ContainerDied","Data":"5723ad18f329438013182ef228c80c5551b9f8bde15255a7d675b706e26640dc"} Mar 17 04:27:23 crc kubenswrapper[4735]: I0317 04:27:23.574980 4735 generic.go:334] "Generic (PLEG): container finished" podID="83f7fc6c-fba5-429d-a031-19607709268d" containerID="5723ad18f329438013182ef228c80c5551b9f8bde15255a7d675b706e26640dc" exitCode=0 Mar 17 04:27:23 crc kubenswrapper[4735]: I0317 04:27:23.577103 4735 scope.go:117] "RemoveContainer" containerID="5723ad18f329438013182ef228c80c5551b9f8bde15255a7d675b706e26640dc" Mar 17 04:27:23 crc kubenswrapper[4735]: I0317 04:27:23.990622 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfgbl_must-gather-586s4_83f7fc6c-fba5-429d-a031-19607709268d/gather/0.log" Mar 17 04:27:30 crc kubenswrapper[4735]: E0317 04:27:30.500622 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:60316->38.102.83.65:40841: write tcp 38.102.83.65:60316->38.102.83.65:40841: write: connection reset by peer Mar 17 04:27:36 crc kubenswrapper[4735]: I0317 04:27:36.419780 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tfgbl/must-gather-586s4"] Mar 17 04:27:36 crc kubenswrapper[4735]: I0317 04:27:36.422431 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-tfgbl/must-gather-586s4" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="copy" containerID="cri-o://7bcb43390a975a1b5dc00a7724b87f89fdcc41b07a3375cde841761a849ba6e5" gracePeriod=2 Mar 17 04:27:36 crc kubenswrapper[4735]: I0317 04:27:36.456106 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tfgbl/must-gather-586s4"] Mar 17 04:27:36 crc kubenswrapper[4735]: I0317 04:27:36.707446 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfgbl_must-gather-586s4_83f7fc6c-fba5-429d-a031-19607709268d/copy/0.log" 
Mar 17 04:27:36 crc kubenswrapper[4735]: I0317 04:27:36.708377 4735 generic.go:334] "Generic (PLEG): container finished" podID="83f7fc6c-fba5-429d-a031-19607709268d" containerID="7bcb43390a975a1b5dc00a7724b87f89fdcc41b07a3375cde841761a849ba6e5" exitCode=143 Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.037052 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfgbl_must-gather-586s4_83f7fc6c-fba5-429d-a031-19607709268d/copy/0.log" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.037554 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.111245 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f7fc6c-fba5-429d-a031-19607709268d-must-gather-output\") pod \"83f7fc6c-fba5-429d-a031-19607709268d\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.111470 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7mcw\" (UniqueName: \"kubernetes.io/projected/83f7fc6c-fba5-429d-a031-19607709268d-kube-api-access-s7mcw\") pod \"83f7fc6c-fba5-429d-a031-19607709268d\" (UID: \"83f7fc6c-fba5-429d-a031-19607709268d\") " Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.119103 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f7fc6c-fba5-429d-a031-19607709268d-kube-api-access-s7mcw" (OuterVolumeSpecName: "kube-api-access-s7mcw") pod "83f7fc6c-fba5-429d-a031-19607709268d" (UID: "83f7fc6c-fba5-429d-a031-19607709268d"). InnerVolumeSpecName "kube-api-access-s7mcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.213454 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7mcw\" (UniqueName: \"kubernetes.io/projected/83f7fc6c-fba5-429d-a031-19607709268d-kube-api-access-s7mcw\") on node \"crc\" DevicePath \"\"" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.299259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f7fc6c-fba5-429d-a031-19607709268d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "83f7fc6c-fba5-429d-a031-19607709268d" (UID: "83f7fc6c-fba5-429d-a031-19607709268d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.315427 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f7fc6c-fba5-429d-a031-19607709268d-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.718394 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfgbl_must-gather-586s4_83f7fc6c-fba5-429d-a031-19607709268d/copy/0.log" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.720344 4735 scope.go:117] "RemoveContainer" containerID="7bcb43390a975a1b5dc00a7724b87f89fdcc41b07a3375cde841761a849ba6e5" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.720497 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfgbl/must-gather-586s4" Mar 17 04:27:37 crc kubenswrapper[4735]: I0317 04:27:37.744740 4735 scope.go:117] "RemoveContainer" containerID="5723ad18f329438013182ef228c80c5551b9f8bde15255a7d675b706e26640dc" Mar 17 04:27:39 crc kubenswrapper[4735]: I0317 04:27:39.086376 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f7fc6c-fba5-429d-a031-19607709268d" path="/var/lib/kubelet/pods/83f7fc6c-fba5-429d-a031-19607709268d/volumes" Mar 17 04:27:42 crc kubenswrapper[4735]: I0317 04:27:42.606740 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:27:42 crc kubenswrapper[4735]: I0317 04:27:42.607934 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.182261 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562028-lnt8b"] Mar 17 04:28:00 crc kubenswrapper[4735]: E0317 04:28:00.183216 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="copy" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183230 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="copy" Mar 17 04:28:00 crc kubenswrapper[4735]: E0317 04:28:00.183249 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" 
containerName="registry-server" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183255 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="registry-server" Mar 17 04:28:00 crc kubenswrapper[4735]: E0317 04:28:00.183273 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="extract-utilities" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183280 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="extract-utilities" Mar 17 04:28:00 crc kubenswrapper[4735]: E0317 04:28:00.183296 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="gather" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183304 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="gather" Mar 17 04:28:00 crc kubenswrapper[4735]: E0317 04:28:00.183328 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="extract-content" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183335 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="extract-content" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183525 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf79a8e7-8f78-426e-bbe0-ef16d20dc342" containerName="registry-server" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183541 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="gather" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.183550 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f7fc6c-fba5-429d-a031-19607709268d" containerName="copy" Mar 17 
04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.185395 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.204162 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562028-lnt8b"] Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.212253 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.212458 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.212596 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.303911 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9cs\" (UniqueName: \"kubernetes.io/projected/978340e2-e7dc-4ac5-a821-52b63f4605e4-kube-api-access-ml9cs\") pod \"auto-csr-approver-29562028-lnt8b\" (UID: \"978340e2-e7dc-4ac5-a821-52b63f4605e4\") " pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.405239 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9cs\" (UniqueName: \"kubernetes.io/projected/978340e2-e7dc-4ac5-a821-52b63f4605e4-kube-api-access-ml9cs\") pod \"auto-csr-approver-29562028-lnt8b\" (UID: \"978340e2-e7dc-4ac5-a821-52b63f4605e4\") " pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.425624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9cs\" (UniqueName: 
\"kubernetes.io/projected/978340e2-e7dc-4ac5-a821-52b63f4605e4-kube-api-access-ml9cs\") pod \"auto-csr-approver-29562028-lnt8b\" (UID: \"978340e2-e7dc-4ac5-a821-52b63f4605e4\") " pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:00 crc kubenswrapper[4735]: I0317 04:28:00.519721 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:01 crc kubenswrapper[4735]: I0317 04:28:01.061950 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562028-lnt8b"] Mar 17 04:28:01 crc kubenswrapper[4735]: I0317 04:28:01.976096 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" event={"ID":"978340e2-e7dc-4ac5-a821-52b63f4605e4","Type":"ContainerStarted","Data":"2982d9f85c5302990772661abebbe0e5ee211665349c97f7d2a712670eb1b51e"} Mar 17 04:28:03 crc kubenswrapper[4735]: I0317 04:28:03.996764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" event={"ID":"978340e2-e7dc-4ac5-a821-52b63f4605e4","Type":"ContainerStarted","Data":"e95fded474adb6e72b028e29ca85979f7e5c51bd3b0025c8522d3065b0831c2a"} Mar 17 04:28:04 crc kubenswrapper[4735]: I0317 04:28:04.035100 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" podStartSLOduration=2.838360299 podStartE2EDuration="4.035078696s" podCreationTimestamp="2026-03-17 04:28:00 +0000 UTC" firstStartedPulling="2026-03-17 04:28:01.067320249 +0000 UTC m=+11906.699553237" lastFinishedPulling="2026-03-17 04:28:02.264038636 +0000 UTC m=+11907.896271634" observedRunningTime="2026-03-17 04:28:04.022016652 +0000 UTC m=+11909.654249670" watchObservedRunningTime="2026-03-17 04:28:04.035078696 +0000 UTC m=+11909.667311684" Mar 17 04:28:05 crc kubenswrapper[4735]: I0317 04:28:05.006960 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="978340e2-e7dc-4ac5-a821-52b63f4605e4" containerID="e95fded474adb6e72b028e29ca85979f7e5c51bd3b0025c8522d3065b0831c2a" exitCode=0 Mar 17 04:28:05 crc kubenswrapper[4735]: I0317 04:28:05.007004 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" event={"ID":"978340e2-e7dc-4ac5-a821-52b63f4605e4","Type":"ContainerDied","Data":"e95fded474adb6e72b028e29ca85979f7e5c51bd3b0025c8522d3065b0831c2a"} Mar 17 04:28:06 crc kubenswrapper[4735]: I0317 04:28:06.346422 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:06 crc kubenswrapper[4735]: I0317 04:28:06.428898 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml9cs\" (UniqueName: \"kubernetes.io/projected/978340e2-e7dc-4ac5-a821-52b63f4605e4-kube-api-access-ml9cs\") pod \"978340e2-e7dc-4ac5-a821-52b63f4605e4\" (UID: \"978340e2-e7dc-4ac5-a821-52b63f4605e4\") " Mar 17 04:28:06 crc kubenswrapper[4735]: I0317 04:28:06.440699 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978340e2-e7dc-4ac5-a821-52b63f4605e4-kube-api-access-ml9cs" (OuterVolumeSpecName: "kube-api-access-ml9cs") pod "978340e2-e7dc-4ac5-a821-52b63f4605e4" (UID: "978340e2-e7dc-4ac5-a821-52b63f4605e4"). InnerVolumeSpecName "kube-api-access-ml9cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:28:06 crc kubenswrapper[4735]: I0317 04:28:06.531840 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml9cs\" (UniqueName: \"kubernetes.io/projected/978340e2-e7dc-4ac5-a821-52b63f4605e4-kube-api-access-ml9cs\") on node \"crc\" DevicePath \"\"" Mar 17 04:28:07 crc kubenswrapper[4735]: I0317 04:28:07.039265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" event={"ID":"978340e2-e7dc-4ac5-a821-52b63f4605e4","Type":"ContainerDied","Data":"2982d9f85c5302990772661abebbe0e5ee211665349c97f7d2a712670eb1b51e"} Mar 17 04:28:07 crc kubenswrapper[4735]: I0317 04:28:07.039302 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2982d9f85c5302990772661abebbe0e5ee211665349c97f7d2a712670eb1b51e" Mar 17 04:28:07 crc kubenswrapper[4735]: I0317 04:28:07.039355 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562028-lnt8b" Mar 17 04:28:07 crc kubenswrapper[4735]: I0317 04:28:07.103896 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562022-qffmh"] Mar 17 04:28:07 crc kubenswrapper[4735]: I0317 04:28:07.112978 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562022-qffmh"] Mar 17 04:28:09 crc kubenswrapper[4735]: I0317 04:28:09.083260 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02dc698c-8baa-46a7-ab60-936c016790b9" path="/var/lib/kubelet/pods/02dc698c-8baa-46a7-ab60-936c016790b9/volumes" Mar 17 04:28:12 crc kubenswrapper[4735]: I0317 04:28:12.606282 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 04:28:12 crc kubenswrapper[4735]: I0317 04:28:12.606718 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:28:13 crc kubenswrapper[4735]: I0317 04:28:13.169175 4735 scope.go:117] "RemoveContainer" containerID="e0b4583f559a4bcf68ad48a52b38378940c0ab5d18d808b4b307e2509ccb70e2" Mar 17 04:28:13 crc kubenswrapper[4735]: I0317 04:28:13.213185 4735 scope.go:117] "RemoveContainer" containerID="bdafc6053c26298391bb26adb83aae496be9ae81631ea64fead4295ed0a47d18" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.718430 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bsspm"] Mar 17 04:28:18 crc kubenswrapper[4735]: E0317 04:28:18.719235 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978340e2-e7dc-4ac5-a821-52b63f4605e4" containerName="oc" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.719246 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="978340e2-e7dc-4ac5-a821-52b63f4605e4" containerName="oc" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.719452 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="978340e2-e7dc-4ac5-a821-52b63f4605e4" containerName="oc" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.720627 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.743624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsspm"] Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.877934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd6hv\" (UniqueName: \"kubernetes.io/projected/90d32b66-76e7-4a12-b896-96a7491b4d58-kube-api-access-cd6hv\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.878029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-utilities\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.878089 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-catalog-content\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.980273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd6hv\" (UniqueName: \"kubernetes.io/projected/90d32b66-76e7-4a12-b896-96a7491b4d58-kube-api-access-cd6hv\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.980360 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-utilities\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.980403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-catalog-content\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.980935 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-catalog-content\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:18 crc kubenswrapper[4735]: I0317 04:28:18.980982 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-utilities\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:19 crc kubenswrapper[4735]: I0317 04:28:19.003405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd6hv\" (UniqueName: \"kubernetes.io/projected/90d32b66-76e7-4a12-b896-96a7491b4d58-kube-api-access-cd6hv\") pod \"redhat-marketplace-bsspm\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:19 crc kubenswrapper[4735]: I0317 04:28:19.045783 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:19 crc kubenswrapper[4735]: I0317 04:28:19.542611 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsspm"] Mar 17 04:28:20 crc kubenswrapper[4735]: I0317 04:28:20.178752 4735 generic.go:334] "Generic (PLEG): container finished" podID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerID="09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2" exitCode=0 Mar 17 04:28:20 crc kubenswrapper[4735]: I0317 04:28:20.178822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerDied","Data":"09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2"} Mar 17 04:28:20 crc kubenswrapper[4735]: I0317 04:28:20.180002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerStarted","Data":"8921ad4235636992fd78db2f6a215dae1ddf0107c49f9053b45ad59d983a7ffc"} Mar 17 04:28:21 crc kubenswrapper[4735]: I0317 04:28:21.188643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerStarted","Data":"84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e"} Mar 17 04:28:22 crc kubenswrapper[4735]: I0317 04:28:22.203371 4735 generic.go:334] "Generic (PLEG): container finished" podID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerID="84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e" exitCode=0 Mar 17 04:28:22 crc kubenswrapper[4735]: I0317 04:28:22.203418 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" 
event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerDied","Data":"84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e"} Mar 17 04:28:23 crc kubenswrapper[4735]: I0317 04:28:23.214660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerStarted","Data":"8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46"} Mar 17 04:28:23 crc kubenswrapper[4735]: I0317 04:28:23.243924 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bsspm" podStartSLOduration=2.8191639459999998 podStartE2EDuration="5.243909193s" podCreationTimestamp="2026-03-17 04:28:18 +0000 UTC" firstStartedPulling="2026-03-17 04:28:20.181070746 +0000 UTC m=+11925.813303734" lastFinishedPulling="2026-03-17 04:28:22.605815963 +0000 UTC m=+11928.238048981" observedRunningTime="2026-03-17 04:28:23.242483769 +0000 UTC m=+11928.874716747" watchObservedRunningTime="2026-03-17 04:28:23.243909193 +0000 UTC m=+11928.876142171" Mar 17 04:28:29 crc kubenswrapper[4735]: I0317 04:28:29.046829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:29 crc kubenswrapper[4735]: I0317 04:28:29.048041 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:29 crc kubenswrapper[4735]: I0317 04:28:29.126207 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:29 crc kubenswrapper[4735]: I0317 04:28:29.337027 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:29 crc kubenswrapper[4735]: I0317 04:28:29.415888 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bsspm"] Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.290929 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bsspm" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="registry-server" containerID="cri-o://8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46" gracePeriod=2 Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.729433 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.888200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd6hv\" (UniqueName: \"kubernetes.io/projected/90d32b66-76e7-4a12-b896-96a7491b4d58-kube-api-access-cd6hv\") pod \"90d32b66-76e7-4a12-b896-96a7491b4d58\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.888426 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-catalog-content\") pod \"90d32b66-76e7-4a12-b896-96a7491b4d58\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.888480 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-utilities\") pod \"90d32b66-76e7-4a12-b896-96a7491b4d58\" (UID: \"90d32b66-76e7-4a12-b896-96a7491b4d58\") " Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.889848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-utilities" (OuterVolumeSpecName: "utilities") pod "90d32b66-76e7-4a12-b896-96a7491b4d58" (UID: 
"90d32b66-76e7-4a12-b896-96a7491b4d58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.897097 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d32b66-76e7-4a12-b896-96a7491b4d58-kube-api-access-cd6hv" (OuterVolumeSpecName: "kube-api-access-cd6hv") pod "90d32b66-76e7-4a12-b896-96a7491b4d58" (UID: "90d32b66-76e7-4a12-b896-96a7491b4d58"). InnerVolumeSpecName "kube-api-access-cd6hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.914527 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90d32b66-76e7-4a12-b896-96a7491b4d58" (UID: "90d32b66-76e7-4a12-b896-96a7491b4d58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.990493 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.990535 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd6hv\" (UniqueName: \"kubernetes.io/projected/90d32b66-76e7-4a12-b896-96a7491b4d58-kube-api-access-cd6hv\") on node \"crc\" DevicePath \"\"" Mar 17 04:28:31 crc kubenswrapper[4735]: I0317 04:28:31.990550 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90d32b66-76e7-4a12-b896-96a7491b4d58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.304948 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerID="8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46" exitCode=0 Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.305025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerDied","Data":"8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46"} Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.305059 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsspm" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.305099 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsspm" event={"ID":"90d32b66-76e7-4a12-b896-96a7491b4d58","Type":"ContainerDied","Data":"8921ad4235636992fd78db2f6a215dae1ddf0107c49f9053b45ad59d983a7ffc"} Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.305141 4735 scope.go:117] "RemoveContainer" containerID="8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.334887 4735 scope.go:117] "RemoveContainer" containerID="84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.368049 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsspm"] Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.378141 4735 scope.go:117] "RemoveContainer" containerID="09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.380011 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsspm"] Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.438141 4735 scope.go:117] "RemoveContainer" 
containerID="8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46" Mar 17 04:28:32 crc kubenswrapper[4735]: E0317 04:28:32.438711 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46\": container with ID starting with 8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46 not found: ID does not exist" containerID="8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.438754 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46"} err="failed to get container status \"8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46\": rpc error: code = NotFound desc = could not find container \"8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46\": container with ID starting with 8af0efd8c453eef12dd452bf81acced3ca2a0ccb188d3fd30ce0f7fea921fb46 not found: ID does not exist" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.438777 4735 scope.go:117] "RemoveContainer" containerID="84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e" Mar 17 04:28:32 crc kubenswrapper[4735]: E0317 04:28:32.439159 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e\": container with ID starting with 84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e not found: ID does not exist" containerID="84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.439200 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e"} err="failed to get container status \"84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e\": rpc error: code = NotFound desc = could not find container \"84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e\": container with ID starting with 84953c9cd73f3910a6c9990ad9c209e4b09ad7711eceaab96c95e4062ef3501e not found: ID does not exist" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.439228 4735 scope.go:117] "RemoveContainer" containerID="09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2" Mar 17 04:28:32 crc kubenswrapper[4735]: E0317 04:28:32.439521 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2\": container with ID starting with 09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2 not found: ID does not exist" containerID="09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2" Mar 17 04:28:32 crc kubenswrapper[4735]: I0317 04:28:32.439559 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2"} err="failed to get container status \"09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2\": rpc error: code = NotFound desc = could not find container \"09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2\": container with ID starting with 09c462b1b5a24d85faf7b73be06189003b0ae8df8afc4a5015958b37a6c97dd2 not found: ID does not exist" Mar 17 04:28:33 crc kubenswrapper[4735]: I0317 04:28:33.085723 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" path="/var/lib/kubelet/pods/90d32b66-76e7-4a12-b896-96a7491b4d58/volumes" Mar 17 04:28:42 crc kubenswrapper[4735]: I0317 
04:28:42.606348 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:28:42 crc kubenswrapper[4735]: I0317 04:28:42.607047 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:28:42 crc kubenswrapper[4735]: I0317 04:28:42.607111 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 04:28:42 crc kubenswrapper[4735]: I0317 04:28:42.609061 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9244f6b071a8f0899af1e5371b673d4f6c56665e99d03dbd6f129f288c94bfa9"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 04:28:42 crc kubenswrapper[4735]: I0317 04:28:42.609172 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://9244f6b071a8f0899af1e5371b673d4f6c56665e99d03dbd6f129f288c94bfa9" gracePeriod=600 Mar 17 04:28:43 crc kubenswrapper[4735]: I0317 04:28:43.421206 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="9244f6b071a8f0899af1e5371b673d4f6c56665e99d03dbd6f129f288c94bfa9" exitCode=0 Mar 17 
04:28:43 crc kubenswrapper[4735]: I0317 04:28:43.421282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"9244f6b071a8f0899af1e5371b673d4f6c56665e99d03dbd6f129f288c94bfa9"} Mar 17 04:28:43 crc kubenswrapper[4735]: I0317 04:28:43.421762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"} Mar 17 04:28:43 crc kubenswrapper[4735]: I0317 04:28:43.421784 4735 scope.go:117] "RemoveContainer" containerID="09a5ad1cd01ecfb90862c98378af4b25f1004a85340f37620a11f532f025ff1e" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.870758 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4kfvn/must-gather-nsckr"] Mar 17 04:29:59 crc kubenswrapper[4735]: E0317 04:29:59.871475 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="extract-utilities" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.871487 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="extract-utilities" Mar 17 04:29:59 crc kubenswrapper[4735]: E0317 04:29:59.871531 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="registry-server" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.871537 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="registry-server" Mar 17 04:29:59 crc kubenswrapper[4735]: E0317 04:29:59.871549 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" 
containerName="extract-content" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.871555 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="extract-content" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.871721 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d32b66-76e7-4a12-b896-96a7491b4d58" containerName="registry-server" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.872633 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.875370 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4kfvn"/"default-dockercfg-v868t" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.875533 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4kfvn"/"kube-root-ca.crt" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.876097 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4kfvn"/"openshift-service-ca.crt" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.988839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8b4da6b-74ed-4bef-920b-8352db585ac6-must-gather-output\") pod \"must-gather-nsckr\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.989223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhvs\" (UniqueName: \"kubernetes.io/projected/b8b4da6b-74ed-4bef-920b-8352db585ac6-kube-api-access-qbhvs\") pod \"must-gather-nsckr\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " 
pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:29:59 crc kubenswrapper[4735]: I0317 04:29:59.991373 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4kfvn/must-gather-nsckr"] Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.091225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8b4da6b-74ed-4bef-920b-8352db585ac6-must-gather-output\") pod \"must-gather-nsckr\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.091373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhvs\" (UniqueName: \"kubernetes.io/projected/b8b4da6b-74ed-4bef-920b-8352db585ac6-kube-api-access-qbhvs\") pod \"must-gather-nsckr\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.092649 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8b4da6b-74ed-4bef-920b-8352db585ac6-must-gather-output\") pod \"must-gather-nsckr\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.143745 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhvs\" (UniqueName: \"kubernetes.io/projected/b8b4da6b-74ed-4bef-920b-8352db585ac6-kube-api-access-qbhvs\") pod \"must-gather-nsckr\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.173211 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562030-55hk5"] Mar 17 04:30:00 crc 
kubenswrapper[4735]: I0317 04:30:00.174257 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.176808 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.179957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.187049 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.189825 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.198409 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn"] Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.203482 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.208228 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.208409 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.210957 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562030-55hk5"] Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.281367 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn"] Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.299850 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec88e4f8-2319-4f5c-a520-bc8e916d6159-secret-volume\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.300026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec88e4f8-2319-4f5c-a520-bc8e916d6159-config-volume\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.300063 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjzxn\" (UniqueName: 
\"kubernetes.io/projected/ec88e4f8-2319-4f5c-a520-bc8e916d6159-kube-api-access-tjzxn\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.300082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp4sd\" (UniqueName: \"kubernetes.io/projected/dc73e95f-ca55-44df-9c4c-6f9a8b412cba-kube-api-access-gp4sd\") pod \"auto-csr-approver-29562030-55hk5\" (UID: \"dc73e95f-ca55-44df-9c4c-6f9a8b412cba\") " pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.401842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec88e4f8-2319-4f5c-a520-bc8e916d6159-secret-volume\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.402215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec88e4f8-2319-4f5c-a520-bc8e916d6159-config-volume\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.402318 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjzxn\" (UniqueName: \"kubernetes.io/projected/ec88e4f8-2319-4f5c-a520-bc8e916d6159-kube-api-access-tjzxn\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc 
kubenswrapper[4735]: I0317 04:30:00.402391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp4sd\" (UniqueName: \"kubernetes.io/projected/dc73e95f-ca55-44df-9c4c-6f9a8b412cba-kube-api-access-gp4sd\") pod \"auto-csr-approver-29562030-55hk5\" (UID: \"dc73e95f-ca55-44df-9c4c-6f9a8b412cba\") " pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.403070 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec88e4f8-2319-4f5c-a520-bc8e916d6159-config-volume\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.406760 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec88e4f8-2319-4f5c-a520-bc8e916d6159-secret-volume\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.418205 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp4sd\" (UniqueName: \"kubernetes.io/projected/dc73e95f-ca55-44df-9c4c-6f9a8b412cba-kube-api-access-gp4sd\") pod \"auto-csr-approver-29562030-55hk5\" (UID: \"dc73e95f-ca55-44df-9c4c-6f9a8b412cba\") " pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.430843 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjzxn\" (UniqueName: \"kubernetes.io/projected/ec88e4f8-2319-4f5c-a520-bc8e916d6159-kube-api-access-tjzxn\") pod \"collect-profiles-29562030-dgzkn\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.500801 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:00 crc kubenswrapper[4735]: I0317 04:30:00.611306 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:01 crc kubenswrapper[4735]: I0317 04:30:01.285586 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4kfvn/must-gather-nsckr"] Mar 17 04:30:01 crc kubenswrapper[4735]: W0317 04:30:01.332584 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b4da6b_74ed_4bef_920b_8352db585ac6.slice/crio-93088e9322726203acaf5b2005a3133c41aaa44e78c907be344c561f226e3a84 WatchSource:0}: Error finding container 93088e9322726203acaf5b2005a3133c41aaa44e78c907be344c561f226e3a84: Status 404 returned error can't find the container with id 93088e9322726203acaf5b2005a3133c41aaa44e78c907be344c561f226e3a84 Mar 17 04:30:01 crc kubenswrapper[4735]: I0317 04:30:01.407221 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562030-55hk5"] Mar 17 04:30:01 crc kubenswrapper[4735]: W0317 04:30:01.434829 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc73e95f_ca55_44df_9c4c_6f9a8b412cba.slice/crio-0c0d83997101983f4f477e8f8576bbd54cd3255af01e3906f8e5f57a089e18a5 WatchSource:0}: Error finding container 0c0d83997101983f4f477e8f8576bbd54cd3255af01e3906f8e5f57a089e18a5: Status 404 returned error can't find the container with id 0c0d83997101983f4f477e8f8576bbd54cd3255af01e3906f8e5f57a089e18a5 Mar 17 04:30:01 crc kubenswrapper[4735]: I0317 04:30:01.592213 4735 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn"] Mar 17 04:30:01 crc kubenswrapper[4735]: W0317 04:30:01.606276 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec88e4f8_2319_4f5c_a520_bc8e916d6159.slice/crio-6c5ded8eff69f3b4bdfa5021d26037f3b2e903d2e7469b278ca6f93a1fe197b3 WatchSource:0}: Error finding container 6c5ded8eff69f3b4bdfa5021d26037f3b2e903d2e7469b278ca6f93a1fe197b3: Status 404 returned error can't find the container with id 6c5ded8eff69f3b4bdfa5021d26037f3b2e903d2e7469b278ca6f93a1fe197b3 Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.303739 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562030-55hk5" event={"ID":"dc73e95f-ca55-44df-9c4c-6f9a8b412cba","Type":"ContainerStarted","Data":"0c0d83997101983f4f477e8f8576bbd54cd3255af01e3906f8e5f57a089e18a5"} Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.306988 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" event={"ID":"ec88e4f8-2319-4f5c-a520-bc8e916d6159","Type":"ContainerStarted","Data":"f7b2aa0063b2e2cda05f433ac197f056d8c5d35ddefe957eb57fffeea2652224"} Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.307030 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" event={"ID":"ec88e4f8-2319-4f5c-a520-bc8e916d6159","Type":"ContainerStarted","Data":"6c5ded8eff69f3b4bdfa5021d26037f3b2e903d2e7469b278ca6f93a1fe197b3"} Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.309199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/must-gather-nsckr" event={"ID":"b8b4da6b-74ed-4bef-920b-8352db585ac6","Type":"ContainerStarted","Data":"2f557cb8adf5b0ffee88ce42452ac554fcc2abd78879695473d5d91e333b7f12"} Mar 17 
04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.309259 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/must-gather-nsckr" event={"ID":"b8b4da6b-74ed-4bef-920b-8352db585ac6","Type":"ContainerStarted","Data":"e853d67ba624a1fb26f035cf6ee0dc61191ecf7e1c9e2833068a9a8ca04c52ff"} Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.309272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/must-gather-nsckr" event={"ID":"b8b4da6b-74ed-4bef-920b-8352db585ac6","Type":"ContainerStarted","Data":"93088e9322726203acaf5b2005a3133c41aaa44e78c907be344c561f226e3a84"} Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.339550 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" podStartSLOduration=2.339532397 podStartE2EDuration="2.339532397s" podCreationTimestamp="2026-03-17 04:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:30:02.333885451 +0000 UTC m=+12027.966118429" watchObservedRunningTime="2026-03-17 04:30:02.339532397 +0000 UTC m=+12027.971765375" Mar 17 04:30:02 crc kubenswrapper[4735]: I0317 04:30:02.363792 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4kfvn/must-gather-nsckr" podStartSLOduration=3.36377529 podStartE2EDuration="3.36377529s" podCreationTimestamp="2026-03-17 04:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:30:02.351481024 +0000 UTC m=+12027.983714012" watchObservedRunningTime="2026-03-17 04:30:02.36377529 +0000 UTC m=+12027.996008268" Mar 17 04:30:03 crc kubenswrapper[4735]: I0317 04:30:03.317529 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" event={"ID":"ec88e4f8-2319-4f5c-a520-bc8e916d6159","Type":"ContainerDied","Data":"f7b2aa0063b2e2cda05f433ac197f056d8c5d35ddefe957eb57fffeea2652224"} Mar 17 04:30:03 crc kubenswrapper[4735]: I0317 04:30:03.318431 4735 generic.go:334] "Generic (PLEG): container finished" podID="ec88e4f8-2319-4f5c-a520-bc8e916d6159" containerID="f7b2aa0063b2e2cda05f433ac197f056d8c5d35ddefe957eb57fffeea2652224" exitCode=0 Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.702666 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.795589 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec88e4f8-2319-4f5c-a520-bc8e916d6159-secret-volume\") pod \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.795664 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec88e4f8-2319-4f5c-a520-bc8e916d6159-config-volume\") pod \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.795753 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjzxn\" (UniqueName: \"kubernetes.io/projected/ec88e4f8-2319-4f5c-a520-bc8e916d6159-kube-api-access-tjzxn\") pod \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\" (UID: \"ec88e4f8-2319-4f5c-a520-bc8e916d6159\") " Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.798286 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec88e4f8-2319-4f5c-a520-bc8e916d6159-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "ec88e4f8-2319-4f5c-a520-bc8e916d6159" (UID: "ec88e4f8-2319-4f5c-a520-bc8e916d6159"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.811794 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec88e4f8-2319-4f5c-a520-bc8e916d6159-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec88e4f8-2319-4f5c-a520-bc8e916d6159" (UID: "ec88e4f8-2319-4f5c-a520-bc8e916d6159"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.814227 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec88e4f8-2319-4f5c-a520-bc8e916d6159-kube-api-access-tjzxn" (OuterVolumeSpecName: "kube-api-access-tjzxn") pod "ec88e4f8-2319-4f5c-a520-bc8e916d6159" (UID: "ec88e4f8-2319-4f5c-a520-bc8e916d6159"). InnerVolumeSpecName "kube-api-access-tjzxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.901144 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec88e4f8-2319-4f5c-a520-bc8e916d6159-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.901372 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec88e4f8-2319-4f5c-a520-bc8e916d6159-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:04 crc kubenswrapper[4735]: I0317 04:30:04.901439 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjzxn\" (UniqueName: \"kubernetes.io/projected/ec88e4f8-2319-4f5c-a520-bc8e916d6159-kube-api-access-tjzxn\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.352399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562030-55hk5" event={"ID":"dc73e95f-ca55-44df-9c4c-6f9a8b412cba","Type":"ContainerStarted","Data":"296d588bd003825b43bf93956e7e02e0531959b0fff1f2433570de8ebc9bceeb"} Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.357139 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" event={"ID":"ec88e4f8-2319-4f5c-a520-bc8e916d6159","Type":"ContainerDied","Data":"6c5ded8eff69f3b4bdfa5021d26037f3b2e903d2e7469b278ca6f93a1fe197b3"} Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.357168 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562030-dgzkn" Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.357182 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c5ded8eff69f3b4bdfa5021d26037f3b2e903d2e7469b278ca6f93a1fe197b3" Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.385837 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562030-55hk5" podStartSLOduration=3.2995746710000002 podStartE2EDuration="5.385822544s" podCreationTimestamp="2026-03-17 04:30:00 +0000 UTC" firstStartedPulling="2026-03-17 04:30:01.44420645 +0000 UTC m=+12027.076439428" lastFinishedPulling="2026-03-17 04:30:03.530454323 +0000 UTC m=+12029.162687301" observedRunningTime="2026-03-17 04:30:05.381583542 +0000 UTC m=+12031.013816520" watchObservedRunningTime="2026-03-17 04:30:05.385822544 +0000 UTC m=+12031.018055522" Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.797266 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq"] Mar 17 04:30:05 crc kubenswrapper[4735]: I0317 04:30:05.812906 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561985-5x5nq"] Mar 17 04:30:06 crc kubenswrapper[4735]: I0317 04:30:06.366342 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc73e95f-ca55-44df-9c4c-6f9a8b412cba" containerID="296d588bd003825b43bf93956e7e02e0531959b0fff1f2433570de8ebc9bceeb" exitCode=0 Mar 17 04:30:06 crc kubenswrapper[4735]: I0317 04:30:06.366377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562030-55hk5" event={"ID":"dc73e95f-ca55-44df-9c4c-6f9a8b412cba","Type":"ContainerDied","Data":"296d588bd003825b43bf93956e7e02e0531959b0fff1f2433570de8ebc9bceeb"} Mar 17 04:30:07 crc kubenswrapper[4735]: I0317 04:30:07.085334 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e632db0e-807f-428b-ad2d-e842c3ef7d15" path="/var/lib/kubelet/pods/e632db0e-807f-428b-ad2d-e842c3ef7d15/volumes" Mar 17 04:30:07 crc kubenswrapper[4735]: I0317 04:30:07.782165 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:07 crc kubenswrapper[4735]: I0317 04:30:07.964047 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp4sd\" (UniqueName: \"kubernetes.io/projected/dc73e95f-ca55-44df-9c4c-6f9a8b412cba-kube-api-access-gp4sd\") pod \"dc73e95f-ca55-44df-9c4c-6f9a8b412cba\" (UID: \"dc73e95f-ca55-44df-9c4c-6f9a8b412cba\") " Mar 17 04:30:07 crc kubenswrapper[4735]: I0317 04:30:07.970750 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc73e95f-ca55-44df-9c4c-6f9a8b412cba-kube-api-access-gp4sd" (OuterVolumeSpecName: "kube-api-access-gp4sd") pod "dc73e95f-ca55-44df-9c4c-6f9a8b412cba" (UID: "dc73e95f-ca55-44df-9c4c-6f9a8b412cba"). InnerVolumeSpecName "kube-api-access-gp4sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.067111 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp4sd\" (UniqueName: \"kubernetes.io/projected/dc73e95f-ca55-44df-9c4c-6f9a8b412cba-kube-api-access-gp4sd\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.171911 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562024-tt9xn"] Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.185264 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562024-tt9xn"] Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.394136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562030-55hk5" event={"ID":"dc73e95f-ca55-44df-9c4c-6f9a8b412cba","Type":"ContainerDied","Data":"0c0d83997101983f4f477e8f8576bbd54cd3255af01e3906f8e5f57a089e18a5"} Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.394393 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c0d83997101983f4f477e8f8576bbd54cd3255af01e3906f8e5f57a089e18a5" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.394231 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562030-55hk5" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.998148 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-x4qgm"] Mar 17 04:30:08 crc kubenswrapper[4735]: E0317 04:30:08.998549 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec88e4f8-2319-4f5c-a520-bc8e916d6159" containerName="collect-profiles" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.998565 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec88e4f8-2319-4f5c-a520-bc8e916d6159" containerName="collect-profiles" Mar 17 04:30:08 crc kubenswrapper[4735]: E0317 04:30:08.998588 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc73e95f-ca55-44df-9c4c-6f9a8b412cba" containerName="oc" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.998594 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc73e95f-ca55-44df-9c4c-6f9a8b412cba" containerName="oc" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.998783 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec88e4f8-2319-4f5c-a520-bc8e916d6159" containerName="collect-profiles" Mar 17 04:30:08 crc kubenswrapper[4735]: I0317 04:30:08.998802 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc73e95f-ca55-44df-9c4c-6f9a8b412cba" containerName="oc" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.000681 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.083172 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f01206-a188-41e6-87ec-3f7e3966c641" path="/var/lib/kubelet/pods/d6f01206-a188-41e6-87ec-3f7e3966c641/volumes" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.109835 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fbbf7b3-12de-4337-bad5-03c181e016e5-host\") pod \"crc-debug-x4qgm\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.109906 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crxpm\" (UniqueName: \"kubernetes.io/projected/3fbbf7b3-12de-4337-bad5-03c181e016e5-kube-api-access-crxpm\") pod \"crc-debug-x4qgm\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.211273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fbbf7b3-12de-4337-bad5-03c181e016e5-host\") pod \"crc-debug-x4qgm\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.211919 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crxpm\" (UniqueName: \"kubernetes.io/projected/3fbbf7b3-12de-4337-bad5-03c181e016e5-kube-api-access-crxpm\") pod \"crc-debug-x4qgm\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.212940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fbbf7b3-12de-4337-bad5-03c181e016e5-host\") pod \"crc-debug-x4qgm\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.229902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crxpm\" (UniqueName: \"kubernetes.io/projected/3fbbf7b3-12de-4337-bad5-03c181e016e5-kube-api-access-crxpm\") pod \"crc-debug-x4qgm\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.317162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:09 crc kubenswrapper[4735]: I0317 04:30:09.402767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" event={"ID":"3fbbf7b3-12de-4337-bad5-03c181e016e5","Type":"ContainerStarted","Data":"a42e47ceaeb526398d9dcc90cee4597977c8f113ac314ed04937aa680c8badb9"} Mar 17 04:30:10 crc kubenswrapper[4735]: I0317 04:30:10.421360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" event={"ID":"3fbbf7b3-12de-4337-bad5-03c181e016e5","Type":"ContainerStarted","Data":"f1101de629d65852e35adea7d6ede2a6d27e1f1a9e32f7ddc2c2c589a77abacd"} Mar 17 04:30:10 crc kubenswrapper[4735]: I0317 04:30:10.446054 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" podStartSLOduration=2.446037113 podStartE2EDuration="2.446037113s" podCreationTimestamp="2026-03-17 04:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:30:10.441444482 +0000 UTC m=+12036.073677460" watchObservedRunningTime="2026-03-17 04:30:10.446037113 +0000 
UTC m=+12036.078270091" Mar 17 04:30:13 crc kubenswrapper[4735]: I0317 04:30:13.399693 4735 scope.go:117] "RemoveContainer" containerID="28b14f41c59197f152fce67a9937502a882cea71b935f9cf4c5afb1cc2365cc4" Mar 17 04:30:13 crc kubenswrapper[4735]: I0317 04:30:13.461942 4735 scope.go:117] "RemoveContainer" containerID="0463ad9c97229910e154383c179d91d5cefdb04eef267f441a39b3882ebc8343" Mar 17 04:30:42 crc kubenswrapper[4735]: I0317 04:30:42.607289 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:30:42 crc kubenswrapper[4735]: I0317 04:30:42.608620 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:30:54 crc kubenswrapper[4735]: I0317 04:30:54.120810 4735 generic.go:334] "Generic (PLEG): container finished" podID="3fbbf7b3-12de-4337-bad5-03c181e016e5" containerID="f1101de629d65852e35adea7d6ede2a6d27e1f1a9e32f7ddc2c2c589a77abacd" exitCode=0 Mar 17 04:30:54 crc kubenswrapper[4735]: I0317 04:30:54.120886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" event={"ID":"3fbbf7b3-12de-4337-bad5-03c181e016e5","Type":"ContainerDied","Data":"f1101de629d65852e35adea7d6ede2a6d27e1f1a9e32f7ddc2c2c589a77abacd"} Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.244310 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.277740 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-x4qgm"] Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.288054 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-x4qgm"] Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.371748 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fbbf7b3-12de-4337-bad5-03c181e016e5-host\") pod \"3fbbf7b3-12de-4337-bad5-03c181e016e5\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.372292 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crxpm\" (UniqueName: \"kubernetes.io/projected/3fbbf7b3-12de-4337-bad5-03c181e016e5-kube-api-access-crxpm\") pod \"3fbbf7b3-12de-4337-bad5-03c181e016e5\" (UID: \"3fbbf7b3-12de-4337-bad5-03c181e016e5\") " Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.371904 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fbbf7b3-12de-4337-bad5-03c181e016e5-host" (OuterVolumeSpecName: "host") pod "3fbbf7b3-12de-4337-bad5-03c181e016e5" (UID: "3fbbf7b3-12de-4337-bad5-03c181e016e5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.381346 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbbf7b3-12de-4337-bad5-03c181e016e5-kube-api-access-crxpm" (OuterVolumeSpecName: "kube-api-access-crxpm") pod "3fbbf7b3-12de-4337-bad5-03c181e016e5" (UID: "3fbbf7b3-12de-4337-bad5-03c181e016e5"). InnerVolumeSpecName "kube-api-access-crxpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.474105 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crxpm\" (UniqueName: \"kubernetes.io/projected/3fbbf7b3-12de-4337-bad5-03c181e016e5-kube-api-access-crxpm\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:55 crc kubenswrapper[4735]: I0317 04:30:55.474157 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fbbf7b3-12de-4337-bad5-03c181e016e5-host\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.137719 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42e47ceaeb526398d9dcc90cee4597977c8f113ac314ed04937aa680c8badb9" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.137789 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-x4qgm" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.526698 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-gjjq5"] Mar 17 04:30:56 crc kubenswrapper[4735]: E0317 04:30:56.527087 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbbf7b3-12de-4337-bad5-03c181e016e5" containerName="container-00" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.527099 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbbf7b3-12de-4337-bad5-03c181e016e5" containerName="container-00" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.527276 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fbbf7b3-12de-4337-bad5-03c181e016e5" containerName="container-00" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.527846 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.601194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78c4ceb1-bacf-48f2-b670-3270633ba80d-host\") pod \"crc-debug-gjjq5\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.601334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqh25\" (UniqueName: \"kubernetes.io/projected/78c4ceb1-bacf-48f2-b670-3270633ba80d-kube-api-access-zqh25\") pod \"crc-debug-gjjq5\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.703552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqh25\" (UniqueName: \"kubernetes.io/projected/78c4ceb1-bacf-48f2-b670-3270633ba80d-kube-api-access-zqh25\") pod \"crc-debug-gjjq5\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.703668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78c4ceb1-bacf-48f2-b670-3270633ba80d-host\") pod \"crc-debug-gjjq5\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.703771 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78c4ceb1-bacf-48f2-b670-3270633ba80d-host\") pod \"crc-debug-gjjq5\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc 
kubenswrapper[4735]: I0317 04:30:56.718950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqh25\" (UniqueName: \"kubernetes.io/projected/78c4ceb1-bacf-48f2-b670-3270633ba80d-kube-api-access-zqh25\") pod \"crc-debug-gjjq5\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:56 crc kubenswrapper[4735]: I0317 04:30:56.841749 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:57 crc kubenswrapper[4735]: I0317 04:30:57.083397 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbbf7b3-12de-4337-bad5-03c181e016e5" path="/var/lib/kubelet/pods/3fbbf7b3-12de-4337-bad5-03c181e016e5/volumes" Mar 17 04:30:57 crc kubenswrapper[4735]: I0317 04:30:57.145752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" event={"ID":"78c4ceb1-bacf-48f2-b670-3270633ba80d","Type":"ContainerStarted","Data":"ac596df3d531cc4da8b0de57384a500dce8b0a3a8eacd182eb072ca32a498b59"} Mar 17 04:30:57 crc kubenswrapper[4735]: I0317 04:30:57.145799 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" event={"ID":"78c4ceb1-bacf-48f2-b670-3270633ba80d","Type":"ContainerStarted","Data":"ba6570376a38c545aab9491278fa0271e679380cb6c1e61c8b0fe8c6dbb7f607"} Mar 17 04:30:57 crc kubenswrapper[4735]: I0317 04:30:57.160630 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" podStartSLOduration=1.160613928 podStartE2EDuration="1.160613928s" podCreationTimestamp="2026-03-17 04:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 04:30:57.15821981 +0000 UTC m=+12082.790452788" watchObservedRunningTime="2026-03-17 04:30:57.160613928 +0000 
UTC m=+12082.792846896" Mar 17 04:30:58 crc kubenswrapper[4735]: I0317 04:30:58.154929 4735 generic.go:334] "Generic (PLEG): container finished" podID="78c4ceb1-bacf-48f2-b670-3270633ba80d" containerID="ac596df3d531cc4da8b0de57384a500dce8b0a3a8eacd182eb072ca32a498b59" exitCode=0 Mar 17 04:30:58 crc kubenswrapper[4735]: I0317 04:30:58.155176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" event={"ID":"78c4ceb1-bacf-48f2-b670-3270633ba80d","Type":"ContainerDied","Data":"ac596df3d531cc4da8b0de57384a500dce8b0a3a8eacd182eb072ca32a498b59"} Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.251725 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.298846 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-gjjq5"] Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.306613 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-gjjq5"] Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.446994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqh25\" (UniqueName: \"kubernetes.io/projected/78c4ceb1-bacf-48f2-b670-3270633ba80d-kube-api-access-zqh25\") pod \"78c4ceb1-bacf-48f2-b670-3270633ba80d\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.447104 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78c4ceb1-bacf-48f2-b670-3270633ba80d-host\") pod \"78c4ceb1-bacf-48f2-b670-3270633ba80d\" (UID: \"78c4ceb1-bacf-48f2-b670-3270633ba80d\") " Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.447417 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/78c4ceb1-bacf-48f2-b670-3270633ba80d-host" (OuterVolumeSpecName: "host") pod "78c4ceb1-bacf-48f2-b670-3270633ba80d" (UID: "78c4ceb1-bacf-48f2-b670-3270633ba80d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.454560 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78c4ceb1-bacf-48f2-b670-3270633ba80d-host\") on node \"crc\" DevicePath \"\"" Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.463913 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c4ceb1-bacf-48f2-b670-3270633ba80d-kube-api-access-zqh25" (OuterVolumeSpecName: "kube-api-access-zqh25") pod "78c4ceb1-bacf-48f2-b670-3270633ba80d" (UID: "78c4ceb1-bacf-48f2-b670-3270633ba80d"). InnerVolumeSpecName "kube-api-access-zqh25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:30:59 crc kubenswrapper[4735]: I0317 04:30:59.555958 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqh25\" (UniqueName: \"kubernetes.io/projected/78c4ceb1-bacf-48f2-b670-3270633ba80d-kube-api-access-zqh25\") on node \"crc\" DevicePath \"\"" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.173000 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6570376a38c545aab9491278fa0271e679380cb6c1e61c8b0fe8c6dbb7f607" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.173058 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-gjjq5" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.542631 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-zsqqm"] Mar 17 04:31:00 crc kubenswrapper[4735]: E0317 04:31:00.543040 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c4ceb1-bacf-48f2-b670-3270633ba80d" containerName="container-00" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.543053 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c4ceb1-bacf-48f2-b670-3270633ba80d" containerName="container-00" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.543210 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c4ceb1-bacf-48f2-b670-3270633ba80d" containerName="container-00" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.543770 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.706307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxpr\" (UniqueName: \"kubernetes.io/projected/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-kube-api-access-slxpr\") pod \"crc-debug-zsqqm\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.706457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-host\") pod \"crc-debug-zsqqm\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.807729 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slxpr\" (UniqueName: 
\"kubernetes.io/projected/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-kube-api-access-slxpr\") pod \"crc-debug-zsqqm\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.808162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-host\") pod \"crc-debug-zsqqm\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.808280 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-host\") pod \"crc-debug-zsqqm\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.835021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slxpr\" (UniqueName: \"kubernetes.io/projected/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-kube-api-access-slxpr\") pod \"crc-debug-zsqqm\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: I0317 04:31:00.859037 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:00 crc kubenswrapper[4735]: W0317 04:31:00.885757 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9795e2c6_8a1a_4fb3_afbb_da0ca0be2af8.slice/crio-3b58150509a60c80be1503e95ba9c1a904bea6e2b4c0fc3b30aa5ff862ca2d43 WatchSource:0}: Error finding container 3b58150509a60c80be1503e95ba9c1a904bea6e2b4c0fc3b30aa5ff862ca2d43: Status 404 returned error can't find the container with id 3b58150509a60c80be1503e95ba9c1a904bea6e2b4c0fc3b30aa5ff862ca2d43 Mar 17 04:31:01 crc kubenswrapper[4735]: I0317 04:31:01.083547 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c4ceb1-bacf-48f2-b670-3270633ba80d" path="/var/lib/kubelet/pods/78c4ceb1-bacf-48f2-b670-3270633ba80d/volumes" Mar 17 04:31:01 crc kubenswrapper[4735]: I0317 04:31:01.181050 4735 generic.go:334] "Generic (PLEG): container finished" podID="9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" containerID="dab68ad695a329090c87a3bec2da7ed676e0b90a0a3ead2a3eec75ee565baf5e" exitCode=0 Mar 17 04:31:01 crc kubenswrapper[4735]: I0317 04:31:01.181086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" event={"ID":"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8","Type":"ContainerDied","Data":"dab68ad695a329090c87a3bec2da7ed676e0b90a0a3ead2a3eec75ee565baf5e"} Mar 17 04:31:01 crc kubenswrapper[4735]: I0317 04:31:01.181109 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" event={"ID":"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8","Type":"ContainerStarted","Data":"3b58150509a60c80be1503e95ba9c1a904bea6e2b4c0fc3b30aa5ff862ca2d43"} Mar 17 04:31:01 crc kubenswrapper[4735]: I0317 04:31:01.220119 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-zsqqm"] Mar 17 04:31:01 crc kubenswrapper[4735]: I0317 04:31:01.228090 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4kfvn/crc-debug-zsqqm"] Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.305787 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.333087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slxpr\" (UniqueName: \"kubernetes.io/projected/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-kube-api-access-slxpr\") pod \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.333153 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-host\") pod \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\" (UID: \"9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8\") " Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.333490 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-host" (OuterVolumeSpecName: "host") pod "9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" (UID: "9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.346108 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-kube-api-access-slxpr" (OuterVolumeSpecName: "kube-api-access-slxpr") pod "9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" (UID: "9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8"). InnerVolumeSpecName "kube-api-access-slxpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.434564 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slxpr\" (UniqueName: \"kubernetes.io/projected/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-kube-api-access-slxpr\") on node \"crc\" DevicePath \"\"" Mar 17 04:31:02 crc kubenswrapper[4735]: I0317 04:31:02.434589 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8-host\") on node \"crc\" DevicePath \"\"" Mar 17 04:31:03 crc kubenswrapper[4735]: I0317 04:31:03.085343 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" path="/var/lib/kubelet/pods/9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8/volumes" Mar 17 04:31:03 crc kubenswrapper[4735]: I0317 04:31:03.198045 4735 scope.go:117] "RemoveContainer" containerID="dab68ad695a329090c87a3bec2da7ed676e0b90a0a3ead2a3eec75ee565baf5e" Mar 17 04:31:03 crc kubenswrapper[4735]: I0317 04:31:03.198177 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/crc-debug-zsqqm" Mar 17 04:31:12 crc kubenswrapper[4735]: I0317 04:31:12.605967 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:31:12 crc kubenswrapper[4735]: I0317 04:31:12.606503 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.044160 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648b74c4fd-9pw4q_f33736df-541b-4d36-bde6-08767c233625/barbican-api/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.134728 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648b74c4fd-9pw4q_f33736df-541b-4d36-bde6-08767c233625/barbican-api-log/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.298389 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-55996449db-5bw4p_8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f/barbican-keystone-listener/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.473909 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-55996449db-5bw4p_8dd863b4-c61a-4c9b-9a72-f2fe04e3ce3f/barbican-keystone-listener-log/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.580153 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-764bc646f-l9wnd_e4e10703-a66a-4bb3-b523-8d5d6e772c04/barbican-worker-log/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.589633 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-764bc646f-l9wnd_e4e10703-a66a-4bb3-b523-8d5d6e772c04/barbican-worker/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.753137 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zqlf9_33665f4a-8504-4b35-9850-d2a567b93418/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:37 crc kubenswrapper[4735]: I0317 04:31:37.873809 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/ceilometer-central-agent/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.076248 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/proxy-httpd/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.096667 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/ceilometer-notification-agent/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.157223 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1e8c020-341a-416d-9816-fe9fece292ec/sg-core/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.419771 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e91a303-0695-4862-ad03-3c9828b5a3a5/cinder-api-log/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.431219 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e91a303-0695-4862-ad03-3c9828b5a3a5/cinder-api/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.498605 4735 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_cinder-scheduler-0_28b4ea44-a1cd-4c67-935c-abdfa2ddb16a/cinder-scheduler/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.735276 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-r2xjp_ecd2012c-981d-4582-8dde-afedd29a4108/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.736874 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_28b4ea44-a1cd-4c67-935c-abdfa2ddb16a/probe/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.907538 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9jg77_eb6e1c65-c874-45d5-9c38-203b56f1385c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:38 crc kubenswrapper[4735]: I0317 04:31:38.986000 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-745d4b7977-nmmzk_274120fc-6c2e-4443-838e-3657b2f4eeef/init/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.159100 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-745d4b7977-nmmzk_274120fc-6c2e-4443-838e-3657b2f4eeef/init/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.319398 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-w8qhj_8697513f-4f89-4aad-9315-2e2053207433/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.443700 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-745d4b7977-nmmzk_274120fc-6c2e-4443-838e-3657b2f4eeef/dnsmasq-dns/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.524038 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_fc51e2e8-eeb0-49e2-8893-f2d1aca3be95/glance-httpd/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.582551 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc51e2e8-eeb0-49e2-8893-f2d1aca3be95/glance-log/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.696545 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e8f3c02b-c96f-469b-86df-21a5342bca54/glance-log/0.log" Mar 17 04:31:39 crc kubenswrapper[4735]: I0317 04:31:39.747108 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e8f3c02b-c96f-469b-86df-21a5342bca54/glance-httpd/0.log" Mar 17 04:31:40 crc kubenswrapper[4735]: I0317 04:31:40.367394 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-55bb658757-9w8c7_4da6ec53-9801-4b92-b096-a55e63155103/heat-engine/0.log" Mar 17 04:31:40 crc kubenswrapper[4735]: I0317 04:31:40.812589 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76cdf95cd8-vx5pd_fc3f6d90-40e7-4962-b788-1e9924edb48f/horizon/0.log" Mar 17 04:31:41 crc kubenswrapper[4735]: I0317 04:31:41.491210 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vw7kk_9853dc02-da1f-43e7-ab8a-aa9aeae6ff65/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:41 crc kubenswrapper[4735]: I0317 04:31:41.927092 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6c568f7f98-2x6ww_0ac8a8c1-5baf-48c7-886f-8701fa5ce663/heat-api/0.log" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.059949 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pgc7b_cb9f2661-6c58-4a56-8080-2701d2dc456a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.110459 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7fb7d7d79f-sbp8m_7534832e-8997-4cf3-8f8a-c8f91debac15/heat-cfnapi/0.log" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.363822 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561881-lgzt4_9576ffce-8ade-47e7-8f60-c928cb0116b5/keystone-cron/0.log" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.586271 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561941-shsnj_fde14ac4-f83c-412e-ba07-d1ce0e8368d6/keystone-cron/0.log" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.605936 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.606006 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.606040 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.608206 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.609210 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" gracePeriod=600 Mar 17 04:31:42 crc kubenswrapper[4735]: E0317 04:31:42.735752 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.766660 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29562001-vwd6p_1115ef04-a8ec-44b6-941d-a540ac1c895b/keystone-cron/0.log" Mar 17 04:31:42 crc kubenswrapper[4735]: I0317 04:31:42.978522 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0da67913-d53f-43fb-8100-c10acda35893/kube-state-metrics/0.log" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.023907 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76cdf95cd8-vx5pd_fc3f6d90-40e7-4962-b788-1e9924edb48f/horizon-log/0.log" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.236975 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vswmz_7354d451-8e24-4c65-855c-e6a33c66d134/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.585508 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-ddf9f84d9-bh5g9_c6f23de1-8a7f-4f5f-986a-df17ee65d752/neutron-httpd/0.log" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.588485 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" exitCode=0 Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.588527 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"} Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.588563 4735 scope.go:117] "RemoveContainer" containerID="9244f6b071a8f0899af1e5371b673d4f6c56665e99d03dbd6f129f288c94bfa9" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.589204 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:31:43 crc kubenswrapper[4735]: E0317 04:31:43.589419 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.862501 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-9455b57b9-66mtr_5a6d3039-699c-477b-a835-2b6fa2709dde/keystone-api/0.log" Mar 17 04:31:43 crc kubenswrapper[4735]: I0317 04:31:43.891661 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2fm6r_3a17772e-46c6-4b61-8fd1-0626f2097355/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:45 crc kubenswrapper[4735]: I0317 04:31:45.011932 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-ddf9f84d9-bh5g9_c6f23de1-8a7f-4f5f-986a-df17ee65d752/neutron-api/0.log" Mar 17 04:31:45 crc kubenswrapper[4735]: I0317 04:31:45.027342 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8aaa374c-be86-4c12-81f6-cef430c7a160/nova-cell0-conductor-conductor/0.log" Mar 17 04:31:45 crc kubenswrapper[4735]: I0317 04:31:45.497152 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0959245b-5ccd-4b6a-925a-1032c2761405/nova-cell1-conductor-conductor/0.log" Mar 17 04:31:45 crc kubenswrapper[4735]: I0317 04:31:45.961754 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8s2pf_69e281de-fd13-450e-acf8-ee5f0561f0b9/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:45 crc kubenswrapper[4735]: I0317 04:31:45.976672 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f7a2283d-990f-46e0-b9e7-e2891468873c/nova-cell1-novncproxy-novncproxy/0.log" Mar 17 04:31:46 crc kubenswrapper[4735]: I0317 04:31:46.334470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6a8bc38a-47a7-4623-bf64-ac77d137ceb4/nova-metadata-log/0.log" Mar 17 04:31:47 crc kubenswrapper[4735]: I0317 04:31:47.275807 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976/nova-api-log/0.log" Mar 17 04:31:47 crc kubenswrapper[4735]: I0317 04:31:47.679306 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6a1a9e1a-0b96-443c-a9f6-ca6e1f0e8e7b/nova-scheduler-scheduler/0.log" Mar 17 04:31:47 crc kubenswrapper[4735]: I0317 04:31:47.727045 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_632237b7-0f5e-426d-ae9e-e434ac0e1da6/mysql-bootstrap/0.log" Mar 17 04:31:47 crc kubenswrapper[4735]: I0317 04:31:47.918239 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_632237b7-0f5e-426d-ae9e-e434ac0e1da6/mysql-bootstrap/0.log" Mar 17 04:31:47 crc kubenswrapper[4735]: I0317 04:31:47.984039 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_632237b7-0f5e-426d-ae9e-e434ac0e1da6/galera/0.log" Mar 17 04:31:48 crc kubenswrapper[4735]: I0317 04:31:48.192409 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_479dabcd-a158-4f12-9baf-77981001d3b3/mysql-bootstrap/0.log" Mar 17 04:31:48 crc kubenswrapper[4735]: I0317 04:31:48.476007 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_479dabcd-a158-4f12-9baf-77981001d3b3/mysql-bootstrap/0.log" Mar 17 04:31:48 crc kubenswrapper[4735]: I0317 04:31:48.498516 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_479dabcd-a158-4f12-9baf-77981001d3b3/galera/0.log" Mar 17 04:31:48 crc kubenswrapper[4735]: I0317 04:31:48.620304 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6a8bc38a-47a7-4623-bf64-ac77d137ceb4/nova-metadata-metadata/0.log" Mar 17 04:31:48 crc kubenswrapper[4735]: I0317 04:31:48.741966 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_8bc7db93-d3a0-4af3-b0e8-5cd148eaeb2c/openstackclient/0.log" Mar 17 04:31:48 crc kubenswrapper[4735]: I0317 04:31:48.990199 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9ngz2_8d35e229-2eeb-4843-a9a2-763156affef5/ovn-controller/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.100005 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lg72z_5bf91339-aeb6-4e5f-b712-295b0fcd38b9/openstack-network-exporter/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.274006 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovsdb-server-init/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.503376 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a4e1e35-1f1f-4ea3-bfd9-3732b1b0a976/nova-api-api/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.503546 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovsdb-server-init/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.520263 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovs-vswitchd/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.577401 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ffshg_e92e2c9c-efdb-43cf-9a17-cde88240c670/ovsdb-server/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.820141 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff00868f-9bd8-42cd-818b-b69373acd188/openstack-network-exporter/0.log" Mar 17 04:31:49 crc kubenswrapper[4735]: I0317 04:31:49.834220 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-pwsgn_039a6699-7fc5-48c1-89b0-0e3946f69349/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:50 crc kubenswrapper[4735]: I0317 04:31:50.087589 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff00868f-9bd8-42cd-818b-b69373acd188/ovn-northd/0.log" Mar 17 04:31:50 crc kubenswrapper[4735]: I0317 04:31:50.104329 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8507883-2fad-4c5c-90c9-58eb17711bb3/openstack-network-exporter/0.log" Mar 17 04:31:50 crc kubenswrapper[4735]: I0317 04:31:50.191813 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8507883-2fad-4c5c-90c9-58eb17711bb3/ovsdbserver-nb/0.log" Mar 17 04:31:50 crc kubenswrapper[4735]: I0317 04:31:50.360384 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6be8d2b-aab5-415b-bb98-298563e9f719/ovsdbserver-sb/0.log" Mar 17 04:31:50 crc kubenswrapper[4735]: I0317 04:31:50.409659 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6be8d2b-aab5-415b-bb98-298563e9f719/openstack-network-exporter/0.log" Mar 17 04:31:50 crc kubenswrapper[4735]: I0317 04:31:50.775481 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5023c673-f338-49b0-b6ef-9bf53abfdb28/setup-container/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.030048 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5023c673-f338-49b0-b6ef-9bf53abfdb28/setup-container/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.087902 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5023c673-f338-49b0-b6ef-9bf53abfdb28/rabbitmq/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.236147 4735 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_placement-9b8f4bc48-rdtm4_d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a/placement-api/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.314672 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_477a21f3-fdbe-42ea-bcd9-05fc4dca6a52/setup-container/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.394733 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9b8f4bc48-rdtm4_d9f98e78-3f06-4cd7-b1e0-1df6fd9e397a/placement-log/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.616780 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_477a21f3-fdbe-42ea-bcd9-05fc4dca6a52/setup-container/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.617065 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_477a21f3-fdbe-42ea-bcd9-05fc4dca6a52/rabbitmq/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.678529 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ljbn9_2a7a18af-48a3-47f8-8318-e6070738d82f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.875795 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6t9fc_afc10dcf-31e4-477c-8f5c-310c6da60988/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:51 crc kubenswrapper[4735]: I0317 04:31:51.976402 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xtdjn_9cc576b2-b1b2-426f-bec9-ebcd2ad12150/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:52 crc kubenswrapper[4735]: I0317 04:31:52.256190 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-p8mnq_6136d26b-0db1-42a3-80df-aac1dd6daf50/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:52 crc kubenswrapper[4735]: I0317 04:31:52.338150 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4j8fl_62ab1715-a702-460a-9195-4646d98e2620/ssh-known-hosts-edpm-deployment/0.log" Mar 17 04:31:52 crc kubenswrapper[4735]: I0317 04:31:52.591335 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544585c649-tthkx_944ea7da-36e9-4cb9-a65c-ff8730df5107/proxy-server/0.log" Mar 17 04:31:52 crc kubenswrapper[4735]: I0317 04:31:52.719738 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j75nx_a2639420-3978-4b55-a81e-f1e770e09cf2/swift-ring-rebalance/0.log" Mar 17 04:31:52 crc kubenswrapper[4735]: I0317 04:31:52.816025 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-auditor/0.log" Mar 17 04:31:52 crc kubenswrapper[4735]: I0317 04:31:52.973260 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-reaper/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.106101 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544585c649-tthkx_944ea7da-36e9-4cb9-a65c-ff8730df5107/proxy-httpd/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.119836 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-server/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.172117 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/account-replicator/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 
04:31:53.254493 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-auditor/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.342989 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-server/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.475035 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-updater/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.477148 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/container-replicator/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.577302 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-auditor/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.615834 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-expirer/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.748176 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-server/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.778433 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-replicator/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.892706 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/object-updater/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.906765 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/rsync/0.log" Mar 17 04:31:53 crc kubenswrapper[4735]: I0317 04:31:53.997511 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_630046af-37dd-4e3d-9109-c130caec8508/swift-recon-cron/0.log" Mar 17 04:31:54 crc kubenswrapper[4735]: I0317 04:31:54.287069 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-multi-thread-testing_07c07abb-1d6c-4fa3-b0ea-5f67c8bb2f0d/tempest-tests-tempest-tests-runner/0.log" Mar 17 04:31:54 crc kubenswrapper[4735]: I0317 04:31:54.333675 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pjbrs_d4354aba-95a7-4d25-a2e5-5935a961a0d1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:54 crc kubenswrapper[4735]: I0317 04:31:54.490476 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-thread-testing_db53fa15-c77f-4396-aaae-d0110e90ddb6/tempest-tests-tempest-tests-runner/0.log" Mar 17 04:31:54 crc kubenswrapper[4735]: I0317 04:31:54.664103 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c0b1008b-df7f-4c35-96c5-ddb5629af0f4/test-operator-logs-container/0.log" Mar 17 04:31:54 crc kubenswrapper[4735]: I0317 04:31:54.790190 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x9cx5_979aea20-7a48-4bff-9188-685c664c6a78/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 04:31:55 crc kubenswrapper[4735]: I0317 04:31:55.080333 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:31:55 crc kubenswrapper[4735]: E0317 04:31:55.080584 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.166237 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562032-rqxmk"] Mar 17 04:32:00 crc kubenswrapper[4735]: E0317 04:32:00.167072 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" containerName="container-00" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.167084 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" containerName="container-00" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.167258 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9795e2c6-8a1a-4fb3-afbb-da0ca0be2af8" containerName="container-00" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.169446 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.181104 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562032-rqxmk"] Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.198947 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.218866 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.222702 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.322254 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k9gc\" (UniqueName: \"kubernetes.io/projected/b783d962-043c-482f-b478-df8868d7672e-kube-api-access-7k9gc\") pod \"auto-csr-approver-29562032-rqxmk\" (UID: \"b783d962-043c-482f-b478-df8868d7672e\") " pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.423566 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k9gc\" (UniqueName: \"kubernetes.io/projected/b783d962-043c-482f-b478-df8868d7672e-kube-api-access-7k9gc\") pod \"auto-csr-approver-29562032-rqxmk\" (UID: \"b783d962-043c-482f-b478-df8868d7672e\") " pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.446506 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k9gc\" (UniqueName: \"kubernetes.io/projected/b783d962-043c-482f-b478-df8868d7672e-kube-api-access-7k9gc\") pod \"auto-csr-approver-29562032-rqxmk\" (UID: \"b783d962-043c-482f-b478-df8868d7672e\") " 
pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:00 crc kubenswrapper[4735]: I0317 04:32:00.516561 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:01 crc kubenswrapper[4735]: I0317 04:32:01.465550 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:32:01 crc kubenswrapper[4735]: I0317 04:32:01.491683 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562032-rqxmk"] Mar 17 04:32:01 crc kubenswrapper[4735]: I0317 04:32:01.797632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" event={"ID":"b783d962-043c-482f-b478-df8868d7672e","Type":"ContainerStarted","Data":"36bdde7e0565940f48203c67ecb4c5fdf0b9c41c66a03cd6a53e01ea916c8202"} Mar 17 04:32:03 crc kubenswrapper[4735]: I0317 04:32:03.817537 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" event={"ID":"b783d962-043c-482f-b478-df8868d7672e","Type":"ContainerStarted","Data":"393c522f0407555dd2b05b8b167083d78b0e3822dad196a10cb188c2583accdd"} Mar 17 04:32:03 crc kubenswrapper[4735]: I0317 04:32:03.859301 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" podStartSLOduration=1.871909662 podStartE2EDuration="3.859282593s" podCreationTimestamp="2026-03-17 04:32:00 +0000 UTC" firstStartedPulling="2026-03-17 04:32:01.457738835 +0000 UTC m=+12147.089971813" lastFinishedPulling="2026-03-17 04:32:03.445111766 +0000 UTC m=+12149.077344744" observedRunningTime="2026-03-17 04:32:03.838770439 +0000 UTC m=+12149.471003417" watchObservedRunningTime="2026-03-17 04:32:03.859282593 +0000 UTC m=+12149.491515571" Mar 17 04:32:05 crc kubenswrapper[4735]: I0317 04:32:05.139387 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_a6d7635e-1b43-4bd6-af6d-234b9502545e/memcached/0.log" Mar 17 04:32:05 crc kubenswrapper[4735]: I0317 04:32:05.842939 4735 generic.go:334] "Generic (PLEG): container finished" podID="b783d962-043c-482f-b478-df8868d7672e" containerID="393c522f0407555dd2b05b8b167083d78b0e3822dad196a10cb188c2583accdd" exitCode=0 Mar 17 04:32:05 crc kubenswrapper[4735]: I0317 04:32:05.842975 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" event={"ID":"b783d962-043c-482f-b478-df8868d7672e","Type":"ContainerDied","Data":"393c522f0407555dd2b05b8b167083d78b0e3822dad196a10cb188c2583accdd"} Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.230100 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.274914 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k9gc\" (UniqueName: \"kubernetes.io/projected/b783d962-043c-482f-b478-df8868d7672e-kube-api-access-7k9gc\") pod \"b783d962-043c-482f-b478-df8868d7672e\" (UID: \"b783d962-043c-482f-b478-df8868d7672e\") " Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.283161 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b783d962-043c-482f-b478-df8868d7672e-kube-api-access-7k9gc" (OuterVolumeSpecName: "kube-api-access-7k9gc") pod "b783d962-043c-482f-b478-df8868d7672e" (UID: "b783d962-043c-482f-b478-df8868d7672e"). InnerVolumeSpecName "kube-api-access-7k9gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.377135 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k9gc\" (UniqueName: \"kubernetes.io/projected/b783d962-043c-482f-b478-df8868d7672e-kube-api-access-7k9gc\") on node \"crc\" DevicePath \"\"" Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.869582 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" event={"ID":"b783d962-043c-482f-b478-df8868d7672e","Type":"ContainerDied","Data":"36bdde7e0565940f48203c67ecb4c5fdf0b9c41c66a03cd6a53e01ea916c8202"} Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.869623 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36bdde7e0565940f48203c67ecb4c5fdf0b9c41c66a03cd6a53e01ea916c8202" Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.869676 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562032-rqxmk" Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.956334 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562026-vlzvh"] Mar 17 04:32:07 crc kubenswrapper[4735]: I0317 04:32:07.983251 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562026-vlzvh"] Mar 17 04:32:09 crc kubenswrapper[4735]: I0317 04:32:09.073779 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:32:09 crc kubenswrapper[4735]: E0317 04:32:09.074277 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:32:09 crc kubenswrapper[4735]: I0317 04:32:09.082071 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72300078-c045-4609-a04b-980cf6e17df9" path="/var/lib/kubelet/pods/72300078-c045-4609-a04b-980cf6e17df9/volumes" Mar 17 04:32:13 crc kubenswrapper[4735]: I0317 04:32:13.694196 4735 scope.go:117] "RemoveContainer" containerID="2388307c3fe38c2a8b4f01dcbbefd2beccb29ec0562ede7ace196cd5bc9b06c7" Mar 17 04:32:20 crc kubenswrapper[4735]: I0317 04:32:20.073848 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:32:20 crc kubenswrapper[4735]: E0317 04:32:20.074829 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:32:27 crc kubenswrapper[4735]: I0317 04:32:27.689419 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/util/0.log" Mar 17 04:32:27 crc kubenswrapper[4735]: I0317 04:32:27.910350 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/util/0.log" Mar 17 04:32:27 crc kubenswrapper[4735]: I0317 04:32:27.962528 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/pull/0.log" Mar 
17 04:32:27 crc kubenswrapper[4735]: I0317 04:32:27.980255 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/pull/0.log" Mar 17 04:32:28 crc kubenswrapper[4735]: I0317 04:32:28.195921 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/util/0.log" Mar 17 04:32:28 crc kubenswrapper[4735]: I0317 04:32:28.199697 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/pull/0.log" Mar 17 04:32:28 crc kubenswrapper[4735]: I0317 04:32:28.273833 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c12dfed765d9187381bb8cb0f7761f41fcb34cb7d4d388a3d80db98ed5c9st_2df0e81e-3a8a-4a41-947e-4a9a86ef50ed/extract/0.log" Mar 17 04:32:28 crc kubenswrapper[4735]: I0317 04:32:28.515675 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-wtjk5_33c29a1c-b6e1-4b71-a0ed-b9a8851a0558/manager/0.log" Mar 17 04:32:28 crc kubenswrapper[4735]: I0317 04:32:28.889675 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-mrkzz_4b767f40-0d53-4067-a546-0f14da7659bc/manager/0.log" Mar 17 04:32:29 crc kubenswrapper[4735]: I0317 04:32:29.084844 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-2tj58_da31620d-afd3-4129-a12b-bfddaead4abd/manager/0.log" Mar 17 04:32:29 crc kubenswrapper[4735]: I0317 04:32:29.246604 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-2gnd8_250680a5-b697-4c2b-9180-3919204f246e/manager/0.log" Mar 17 04:32:29 crc kubenswrapper[4735]: I0317 04:32:29.514980 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-r4z8g_a628f7da-d487-43c7-9965-5697505667fb/manager/0.log" Mar 17 04:32:29 crc kubenswrapper[4735]: I0317 04:32:29.934836 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-t6nkw_8c6fa0c7-2f56-4498-87ee-7ae0f64f262e/manager/0.log" Mar 17 04:32:30 crc kubenswrapper[4735]: I0317 04:32:30.161202 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-v78gc_5fc876da-b6d0-4c8b-ab2e-84558e5ba079/manager/0.log" Mar 17 04:32:30 crc kubenswrapper[4735]: I0317 04:32:30.561940 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-vm55l_b596dfa6-ef2c-4c3c-80fb-f18229f7b99f/manager/0.log" Mar 17 04:32:30 crc kubenswrapper[4735]: I0317 04:32:30.755886 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-7cjd9_9c858fbe-58b8-4dea-91e5-05366d1bd648/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.124288 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dsd6v_37fd8e6e-e2a0-4bc5-8d9e-6ea87f2a0575/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.146223 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-gwwjh_4701b52d-3086-4792-bc35-c51cf4d63ad8/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.197009 4735 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-4dmmd_6306f8c1-066b-46e3-b76c-5490680c0ae3/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.472721 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-kklq4_eefbeadd-f37e-4419-8d66-ba1731016fd0/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.501210 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-lhsm8_486810ee-5bdf-451a-bc69-179723bbe75d/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.725565 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-fwv89_77baae0d-2b5c-4b05-ab71-c87259ef645a/manager/0.log" Mar 17 04:32:31 crc kubenswrapper[4735]: I0317 04:32:31.889226 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6597c75466-b4kfj_f640bb25-8c1e-4718-9f44-dec9ab10fbb9/operator/0.log" Mar 17 04:32:32 crc kubenswrapper[4735]: I0317 04:32:32.078255 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-djqxj_0b8682ae-cefe-457b-b2b2-76753bc1db5f/registry-server/0.log" Mar 17 04:32:32 crc kubenswrapper[4735]: I0317 04:32:32.513289 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-t8wv8_1f207812-4855-4e6c-9c7d-64d45ac3c917/manager/0.log" Mar 17 04:32:32 crc kubenswrapper[4735]: I0317 04:32:32.552946 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-257t8_ae2a2402-a926-4331-aed4-5b25fc55b9ba/manager/0.log" Mar 17 04:32:32 crc kubenswrapper[4735]: I0317 04:32:32.741826 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5brsc_c86b3820-e6e8-41e2-85f4-69a8fda476a2/operator/0.log" Mar 17 04:32:33 crc kubenswrapper[4735]: I0317 04:32:33.085718 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-v2knm_4ffd9139-49ec-4622-bed5-0744011e0469/manager/0.log" Mar 17 04:32:33 crc kubenswrapper[4735]: I0317 04:32:33.142423 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-576dc457f-z87tj_9a63ed64-5e87-4f3d-8568-284237818e90/manager/0.log" Mar 17 04:32:33 crc kubenswrapper[4735]: I0317 04:32:33.199114 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-wnncr_e786fada-7145-4c03-a0bb-073321237c38/manager/0.log" Mar 17 04:32:33 crc kubenswrapper[4735]: I0317 04:32:33.322626 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-p9zdz_9e4382a5-eb49-4a6b-8ee0-4692cad88ae7/manager/0.log" Mar 17 04:32:33 crc kubenswrapper[4735]: I0317 04:32:33.380046 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-txd6q_f6c692fb-2aa5-47c4-8036-265ad9d63131/manager/0.log" Mar 17 04:32:34 crc kubenswrapper[4735]: I0317 04:32:34.074026 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:32:34 crc kubenswrapper[4735]: E0317 04:32:34.074461 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:32:49 crc kubenswrapper[4735]: I0317 04:32:49.073278 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:32:49 crc kubenswrapper[4735]: E0317 04:32:49.073929 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:32:55 crc kubenswrapper[4735]: I0317 04:32:55.617188 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hsd5h_76c60596-d8ac-452c-bf33-b35157ee0970/control-plane-machine-set-operator/0.log" Mar 17 04:32:55 crc kubenswrapper[4735]: I0317 04:32:55.779114 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xqdp7_d1d16522-3b99-4ee4-b1ae-901b135c661d/kube-rbac-proxy/0.log" Mar 17 04:32:55 crc kubenswrapper[4735]: I0317 04:32:55.870034 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xqdp7_d1d16522-3b99-4ee4-b1ae-901b135c661d/machine-api-operator/0.log" Mar 17 04:33:02 crc kubenswrapper[4735]: I0317 04:33:02.072707 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:33:02 crc kubenswrapper[4735]: E0317 04:33:02.073518 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:33:10 crc kubenswrapper[4735]: I0317 04:33:10.688359 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4t98c_a4d0b9dc-dc40-431d-9fc3-8a45378faaf9/cert-manager-controller/0.log" Mar 17 04:33:10 crc kubenswrapper[4735]: I0317 04:33:10.952212 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5g5gl_f026bd7d-4093-433f-b42a-e2f88dbd2c7f/cert-manager-cainjector/0.log" Mar 17 04:33:11 crc kubenswrapper[4735]: I0317 04:33:11.016395 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-g9frk_2c5a87ba-17e0-4807-980f-f42af0ffb51a/cert-manager-webhook/0.log" Mar 17 04:33:17 crc kubenswrapper[4735]: I0317 04:33:17.074132 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:33:17 crc kubenswrapper[4735]: E0317 04:33:17.074932 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:33:25 crc kubenswrapper[4735]: I0317 04:33:25.491641 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-4jtvs_dfa311dd-f771-40b1-9b71-0dd3f2e09ba6/nmstate-console-plugin/0.log" Mar 17 04:33:25 crc kubenswrapper[4735]: I0317 04:33:25.688609 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qvt46_5f46807e-1877-476e-aeb0-7e1acfe206da/nmstate-handler/0.log" Mar 17 04:33:25 crc kubenswrapper[4735]: I0317 04:33:25.743338 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tl9ds_1a943747-e426-437c-a203-7d326d2b1cc1/kube-rbac-proxy/0.log" Mar 17 04:33:25 crc kubenswrapper[4735]: I0317 04:33:25.803392 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tl9ds_1a943747-e426-437c-a203-7d326d2b1cc1/nmstate-metrics/0.log" Mar 17 04:33:25 crc kubenswrapper[4735]: I0317 04:33:25.964590 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-6h9tn_a2a8ea5f-24b7-4ba0-a197-9f9700436a0e/nmstate-operator/0.log" Mar 17 04:33:26 crc kubenswrapper[4735]: I0317 04:33:26.093836 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rbclm_b4c72542-813d-4029-a6ef-f76feb3f6459/nmstate-webhook/0.log" Mar 17 04:33:32 crc kubenswrapper[4735]: I0317 04:33:32.072891 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:33:32 crc kubenswrapper[4735]: E0317 04:33:32.073512 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:33:44 crc kubenswrapper[4735]: I0317 04:33:44.072918 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:33:44 crc kubenswrapper[4735]: E0317 
04:33:44.073599 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:33:56 crc kubenswrapper[4735]: I0317 04:33:56.073574 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:33:56 crc kubenswrapper[4735]: E0317 04:33:56.074594 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:33:59 crc kubenswrapper[4735]: I0317 04:33:59.784484 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-788lk_1edf1039-ebea-4804-9f30-6844633b7919/kube-rbac-proxy/0.log" Mar 17 04:33:59 crc kubenswrapper[4735]: I0317 04:33:59.841002 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-788lk_1edf1039-ebea-4804-9f30-6844633b7919/controller/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.087220 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.143170 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562034-g9whz"] Mar 17 04:34:00 crc kubenswrapper[4735]: 
E0317 04:34:00.143826 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b783d962-043c-482f-b478-df8868d7672e" containerName="oc" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.143842 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b783d962-043c-482f-b478-df8868d7672e" containerName="oc" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.144030 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b783d962-043c-482f-b478-df8868d7672e" containerName="oc" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.144607 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.147558 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.147694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.147906 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.165071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562034-g9whz"] Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.314401 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrhz7\" (UniqueName: \"kubernetes.io/projected/922fb719-bcea-458f-9d6a-b10825eafaaa-kube-api-access-vrhz7\") pod \"auto-csr-approver-29562034-g9whz\" (UID: \"922fb719-bcea-458f-9d6a-b10825eafaaa\") " pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.343286 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.379574 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.388270 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.415838 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrhz7\" (UniqueName: \"kubernetes.io/projected/922fb719-bcea-458f-9d6a-b10825eafaaa-kube-api-access-vrhz7\") pod \"auto-csr-approver-29562034-g9whz\" (UID: \"922fb719-bcea-458f-9d6a-b10825eafaaa\") " pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.442895 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrhz7\" (UniqueName: \"kubernetes.io/projected/922fb719-bcea-458f-9d6a-b10825eafaaa-kube-api-access-vrhz7\") pod \"auto-csr-approver-29562034-g9whz\" (UID: \"922fb719-bcea-458f-9d6a-b10825eafaaa\") " pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.490707 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.495139 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.688896 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.753841 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.782640 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:34:00 crc kubenswrapper[4735]: I0317 04:34:00.858686 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.052718 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562034-g9whz"] Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.110038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562034-g9whz" event={"ID":"922fb719-bcea-458f-9d6a-b10825eafaaa","Type":"ContainerStarted","Data":"da2f0b6273bf5ae70cb71c0f79a754b0a3d6e71d9c5244b7d5bd1d4653bed622"} Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.208849 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-reloader/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.211511 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-frr-files/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.217165 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/cp-metrics/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.235996 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/controller/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.512571 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/frr-metrics/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.517343 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/kube-rbac-proxy/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.573887 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/kube-rbac-proxy-frr/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.722805 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/reloader/0.log" Mar 17 04:34:01 crc kubenswrapper[4735]: I0317 04:34:01.874749 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-c667b_665a221c-0d9f-4dfd-888a-fc7d5f09fbdb/frr-k8s-webhook-server/0.log" Mar 17 04:34:02 crc kubenswrapper[4735]: I0317 04:34:02.136246 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5bffff7ccd-ss6s5_81e7afb5-02be-49b0-bd12-39b2b2346a93/manager/0.log" Mar 17 04:34:02 crc kubenswrapper[4735]: I0317 04:34:02.861162 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b5d585c89-bjzst_1841e816-4298-4c01-8bfa-07273ea8dfff/webhook-server/0.log" Mar 17 04:34:02 crc kubenswrapper[4735]: I0317 04:34:02.953122 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-slwx6_3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e/kube-rbac-proxy/0.log" Mar 17 04:34:03 crc kubenswrapper[4735]: I0317 04:34:03.143343 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562034-g9whz" event={"ID":"922fb719-bcea-458f-9d6a-b10825eafaaa","Type":"ContainerStarted","Data":"af49b41de559d1acc44d0df22b7bf18385cc847ded95be3072a9bbf1b2c52d3c"} Mar 17 04:34:03 crc kubenswrapper[4735]: I0317 04:34:03.158980 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562034-g9whz" podStartSLOduration=2.009841354 podStartE2EDuration="3.158964454s" podCreationTimestamp="2026-03-17 04:34:00 +0000 UTC" firstStartedPulling="2026-03-17 04:34:01.065946178 +0000 UTC m=+12266.698179156" lastFinishedPulling="2026-03-17 04:34:02.215069278 +0000 UTC m=+12267.847302256" observedRunningTime="2026-03-17 04:34:03.157194272 +0000 UTC m=+12268.789427250" watchObservedRunningTime="2026-03-17 04:34:03.158964454 +0000 UTC m=+12268.791197432" Mar 17 04:34:03 crc kubenswrapper[4735]: I0317 04:34:03.834104 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-slwx6_3c5741a9-7da3-4e1c-ac5f-9c6f540c6f9e/speaker/0.log" Mar 17 04:34:04 crc kubenswrapper[4735]: I0317 04:34:04.167324 4735 generic.go:334] "Generic (PLEG): container finished" podID="922fb719-bcea-458f-9d6a-b10825eafaaa" containerID="af49b41de559d1acc44d0df22b7bf18385cc847ded95be3072a9bbf1b2c52d3c" exitCode=0 Mar 17 04:34:04 crc kubenswrapper[4735]: I0317 04:34:04.167748 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562034-g9whz" 
event={"ID":"922fb719-bcea-458f-9d6a-b10825eafaaa","Type":"ContainerDied","Data":"af49b41de559d1acc44d0df22b7bf18385cc847ded95be3072a9bbf1b2c52d3c"} Mar 17 04:34:04 crc kubenswrapper[4735]: I0317 04:34:04.562648 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfcw2_899c4d8c-fe75-4189-af67-c3edbd89d3fc/frr/0.log" Mar 17 04:34:05 crc kubenswrapper[4735]: I0317 04:34:05.597444 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:05 crc kubenswrapper[4735]: I0317 04:34:05.743215 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrhz7\" (UniqueName: \"kubernetes.io/projected/922fb719-bcea-458f-9d6a-b10825eafaaa-kube-api-access-vrhz7\") pod \"922fb719-bcea-458f-9d6a-b10825eafaaa\" (UID: \"922fb719-bcea-458f-9d6a-b10825eafaaa\") " Mar 17 04:34:05 crc kubenswrapper[4735]: I0317 04:34:05.753210 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922fb719-bcea-458f-9d6a-b10825eafaaa-kube-api-access-vrhz7" (OuterVolumeSpecName: "kube-api-access-vrhz7") pod "922fb719-bcea-458f-9d6a-b10825eafaaa" (UID: "922fb719-bcea-458f-9d6a-b10825eafaaa"). InnerVolumeSpecName "kube-api-access-vrhz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:34:05 crc kubenswrapper[4735]: I0317 04:34:05.845517 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrhz7\" (UniqueName: \"kubernetes.io/projected/922fb719-bcea-458f-9d6a-b10825eafaaa-kube-api-access-vrhz7\") on node \"crc\" DevicePath \"\"" Mar 17 04:34:06 crc kubenswrapper[4735]: I0317 04:34:06.186734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562034-g9whz" event={"ID":"922fb719-bcea-458f-9d6a-b10825eafaaa","Type":"ContainerDied","Data":"da2f0b6273bf5ae70cb71c0f79a754b0a3d6e71d9c5244b7d5bd1d4653bed622"} Mar 17 04:34:06 crc kubenswrapper[4735]: I0317 04:34:06.186783 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2f0b6273bf5ae70cb71c0f79a754b0a3d6e71d9c5244b7d5bd1d4653bed622" Mar 17 04:34:06 crc kubenswrapper[4735]: I0317 04:34:06.186803 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562034-g9whz" Mar 17 04:34:06 crc kubenswrapper[4735]: I0317 04:34:06.265273 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562028-lnt8b"] Mar 17 04:34:06 crc kubenswrapper[4735]: I0317 04:34:06.274072 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562028-lnt8b"] Mar 17 04:34:07 crc kubenswrapper[4735]: I0317 04:34:07.072793 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:34:07 crc kubenswrapper[4735]: E0317 04:34:07.073336 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:34:07 crc kubenswrapper[4735]: I0317 04:34:07.084966 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978340e2-e7dc-4ac5-a821-52b63f4605e4" path="/var/lib/kubelet/pods/978340e2-e7dc-4ac5-a821-52b63f4605e4/volumes" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.301427 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-822tf"] Mar 17 04:34:09 crc kubenswrapper[4735]: E0317 04:34:09.301978 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922fb719-bcea-458f-9d6a-b10825eafaaa" containerName="oc" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.301998 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="922fb719-bcea-458f-9d6a-b10825eafaaa" containerName="oc" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.302230 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="922fb719-bcea-458f-9d6a-b10825eafaaa" containerName="oc" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.307263 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.315118 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-822tf"] Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.417693 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6wqx\" (UniqueName: \"kubernetes.io/projected/f88200fa-011b-4263-a46a-3116602478b8-kube-api-access-k6wqx\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.418211 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-catalog-content\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.418280 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-utilities\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.520125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-catalog-content\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.520194 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-utilities\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.520263 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6wqx\" (UniqueName: \"kubernetes.io/projected/f88200fa-011b-4263-a46a-3116602478b8-kube-api-access-k6wqx\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.521225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-catalog-content\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.522029 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-utilities\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.539462 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6wqx\" (UniqueName: \"kubernetes.io/projected/f88200fa-011b-4263-a46a-3116602478b8-kube-api-access-k6wqx\") pod \"certified-operators-822tf\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") " pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:09 crc kubenswrapper[4735]: I0317 04:34:09.642193 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-822tf" Mar 17 04:34:10 crc kubenswrapper[4735]: I0317 04:34:10.143251 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-822tf"] Mar 17 04:34:10 crc kubenswrapper[4735]: I0317 04:34:10.242661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerStarted","Data":"c49136ab92389644dc8b3e2b9b5d7aa412f9d1bccafe661524379ae670ce66aa"} Mar 17 04:34:11 crc kubenswrapper[4735]: I0317 04:34:11.255111 4735 generic.go:334] "Generic (PLEG): container finished" podID="f88200fa-011b-4263-a46a-3116602478b8" containerID="8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56" exitCode=0 Mar 17 04:34:11 crc kubenswrapper[4735]: I0317 04:34:11.256284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerDied","Data":"8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56"} Mar 17 04:34:12 crc kubenswrapper[4735]: I0317 04:34:12.264685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerStarted","Data":"54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b"} Mar 17 04:34:13 crc kubenswrapper[4735]: I0317 04:34:13.797716 4735 scope.go:117] "RemoveContainer" containerID="e95fded474adb6e72b028e29ca85979f7e5c51bd3b0025c8522d3065b0831c2a" Mar 17 04:34:14 crc kubenswrapper[4735]: I0317 04:34:14.283624 4735 generic.go:334] "Generic (PLEG): container finished" podID="f88200fa-011b-4263-a46a-3116602478b8" containerID="54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b" exitCode=0 Mar 17 04:34:14 crc kubenswrapper[4735]: I0317 04:34:14.283665 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerDied","Data":"54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b"} Mar 17 04:34:15 crc kubenswrapper[4735]: I0317 04:34:15.293902 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerStarted","Data":"9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48"} Mar 17 04:34:15 crc kubenswrapper[4735]: I0317 04:34:15.320888 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-822tf" podStartSLOduration=2.572732722 podStartE2EDuration="6.320837525s" podCreationTimestamp="2026-03-17 04:34:09 +0000 UTC" firstStartedPulling="2026-03-17 04:34:11.258525134 +0000 UTC m=+12276.890758102" lastFinishedPulling="2026-03-17 04:34:15.006629917 +0000 UTC m=+12280.638862905" observedRunningTime="2026-03-17 04:34:15.315313963 +0000 UTC m=+12280.947546961" watchObservedRunningTime="2026-03-17 04:34:15.320837525 +0000 UTC m=+12280.953070503" Mar 17 04:34:18 crc kubenswrapper[4735]: I0317 04:34:18.074216 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a" Mar 17 04:34:18 crc kubenswrapper[4735]: E0317 04:34:18.076409 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" Mar 17 04:34:18 crc kubenswrapper[4735]: I0317 04:34:18.648117 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/util/0.log"
Mar 17 04:34:18 crc kubenswrapper[4735]: I0317 04:34:18.821532 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/util/0.log"
Mar 17 04:34:18 crc kubenswrapper[4735]: I0317 04:34:18.900823 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/pull/0.log"
Mar 17 04:34:18 crc kubenswrapper[4735]: I0317 04:34:18.979493 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/pull/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.138274 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/pull/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.141501 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/util/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.208200 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kbsv7_237e71f6-eba3-4e8b-a219-18685f510184/extract/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.343974 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/util/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.540807 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/util/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.556265 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/pull/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.578129 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/pull/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.643111 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-822tf"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.643160 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-822tf"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.694094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-822tf"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.825288 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/util/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.836645 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/pull/0.log"
Mar 17 04:34:19 crc kubenswrapper[4735]: I0317 04:34:19.958023 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17m2vg_0538ead5-a867-488c-819e-8ea63b6ad7ff/extract/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.309213 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/extract-utilities/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.388398 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-822tf"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.431463 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-822tf"]
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.622946 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/extract-utilities/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.645552 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/extract-content/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.676872 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/extract-content/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.823918 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/extract-utilities/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.851648 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/extract-content/0.log"
Mar 17 04:34:20 crc kubenswrapper[4735]: I0317 04:34:20.913454 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-822tf_f88200fa-011b-4263-a46a-3116602478b8/registry-server/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.061049 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-utilities/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.290787 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-content/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.300003 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-content/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.309163 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-utilities/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.609337 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-content/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.613003 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/extract-utilities/0.log"
Mar 17 04:34:21 crc kubenswrapper[4735]: I0317 04:34:21.818911 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-utilities/0.log"
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.119228 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-content/0.log"
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.210386 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-content/0.log"
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.241798 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-utilities/0.log"
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.357648 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-822tf" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="registry-server" containerID="cri-o://9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48" gracePeriod=2
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.470301 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-utilities/0.log"
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.623056 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/extract-content/0.log"
Mar 17 04:34:22 crc kubenswrapper[4735]: I0317 04:34:22.920025 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-822tf"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.019533 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-catalog-content\") pod \"f88200fa-011b-4263-a46a-3116602478b8\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") "
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.019683 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-utilities\") pod \"f88200fa-011b-4263-a46a-3116602478b8\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") "
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.019717 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6wqx\" (UniqueName: \"kubernetes.io/projected/f88200fa-011b-4263-a46a-3116602478b8-kube-api-access-k6wqx\") pod \"f88200fa-011b-4263-a46a-3116602478b8\" (UID: \"f88200fa-011b-4263-a46a-3116602478b8\") "
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.020433 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-utilities" (OuterVolumeSpecName: "utilities") pod "f88200fa-011b-4263-a46a-3116602478b8" (UID: "f88200fa-011b-4263-a46a-3116602478b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.020849 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.024056 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2qkds_f56c71b2-cf3c-4fe1-8c13-fd905c5a623d/marketplace-operator/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.031743 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88200fa-011b-4263-a46a-3116602478b8-kube-api-access-k6wqx" (OuterVolumeSpecName: "kube-api-access-k6wqx") pod "f88200fa-011b-4263-a46a-3116602478b8" (UID: "f88200fa-011b-4263-a46a-3116602478b8"). InnerVolumeSpecName "kube-api-access-k6wqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.071553 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f88200fa-011b-4263-a46a-3116602478b8" (UID: "f88200fa-011b-4263-a46a-3116602478b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.123142 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88200fa-011b-4263-a46a-3116602478b8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.123199 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6wqx\" (UniqueName: \"kubernetes.io/projected/f88200fa-011b-4263-a46a-3116602478b8-kube-api-access-k6wqx\") on node \"crc\" DevicePath \"\""
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.303038 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kjm9t_a217fdf0-6e31-4c78-a859-428945d1067e/registry-server/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.336558 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-utilities/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.362130 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qpdb8_bf62916f-f358-4caa-9c7e-37527c4610d3/registry-server/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.366743 4735 generic.go:334] "Generic (PLEG): container finished" podID="f88200fa-011b-4263-a46a-3116602478b8" containerID="9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48" exitCode=0
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.366800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerDied","Data":"9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48"}
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.366878 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-822tf" event={"ID":"f88200fa-011b-4263-a46a-3116602478b8","Type":"ContainerDied","Data":"c49136ab92389644dc8b3e2b9b5d7aa412f9d1bccafe661524379ae670ce66aa"}
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.366903 4735 scope.go:117] "RemoveContainer" containerID="9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.366814 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-822tf"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.407252 4735 scope.go:117] "RemoveContainer" containerID="54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.409669 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-822tf"]
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.465828 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-822tf"]
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.475444 4735 scope.go:117] "RemoveContainer" containerID="8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.503311 4735 scope.go:117] "RemoveContainer" containerID="9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48"
Mar 17 04:34:23 crc kubenswrapper[4735]: E0317 04:34:23.510069 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48\": container with ID starting with 9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48 not found: ID does not exist" containerID="9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.510110 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-utilities/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.510111 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48"} err="failed to get container status \"9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48\": rpc error: code = NotFound desc = could not find container \"9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48\": container with ID starting with 9342101f53eba4f260cb6e0037e68ffb9888f5f0601689181f27a7785860db48 not found: ID does not exist"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.510156 4735 scope.go:117] "RemoveContainer" containerID="54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b"
Mar 17 04:34:23 crc kubenswrapper[4735]: E0317 04:34:23.513533 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b\": container with ID starting with 54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b not found: ID does not exist" containerID="54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.513564 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b"} err="failed to get container status \"54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b\": rpc error: code = NotFound desc = could not find container \"54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b\": container with ID starting with 54f524c8a0f3c9ac8dce0cbf48ca43aaa9753a4cd509e56aaa36eed49a412b1b not found: ID does not exist"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.513586 4735 scope.go:117] "RemoveContainer" containerID="8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56"
Mar 17 04:34:23 crc kubenswrapper[4735]: E0317 04:34:23.514231 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56\": container with ID starting with 8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56 not found: ID does not exist" containerID="8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.514287 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56"} err="failed to get container status \"8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56\": rpc error: code = NotFound desc = could not find container \"8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56\": container with ID starting with 8e982761d20df96d84533ca174b5659213f2282cb58566e1c4f3e1b9cb6a8d56 not found: ID does not exist"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.618831 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-content/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.624937 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-content/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.767213 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-utilities/0.log"
Mar 17 04:34:23 crc kubenswrapper[4735]: I0317 04:34:23.768041 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/extract-content/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.018556 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-utilities/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.096708 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7k4g_93613795-fcb8-40c7-a9d7-5001de165a69/registry-server/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.145435 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-content/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.162471 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-utilities/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.186458 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-content/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.390088 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-utilities/0.log"
Mar 17 04:34:24 crc kubenswrapper[4735]: I0317 04:34:24.426114 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/extract-content/0.log"
Mar 17 04:34:25 crc kubenswrapper[4735]: I0317 04:34:25.083987 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88200fa-011b-4263-a46a-3116602478b8" path="/var/lib/kubelet/pods/f88200fa-011b-4263-a46a-3116602478b8/volumes"
Mar 17 04:34:25 crc kubenswrapper[4735]: I0317 04:34:25.733287 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p6nl7_8ce6a4e4-dae9-4138-871c-c9c3c05641e6/registry-server/0.log"
Mar 17 04:34:32 crc kubenswrapper[4735]: I0317 04:34:32.073164 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:34:32 crc kubenswrapper[4735]: E0317 04:34:32.073923 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:34:47 crc kubenswrapper[4735]: I0317 04:34:47.074078 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:34:47 crc kubenswrapper[4735]: E0317 04:34:47.075600 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:34:53 crc kubenswrapper[4735]: E0317 04:34:53.453937 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:34814->38.102.83.65:40841: write tcp 38.102.83.65:34814->38.102.83.65:40841: write: broken pipe
Mar 17 04:35:00 crc kubenswrapper[4735]: I0317 04:35:00.073597 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:35:00 crc kubenswrapper[4735]: E0317 04:35:00.074258 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:35:11 crc kubenswrapper[4735]: I0317 04:35:11.073219 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:35:11 crc kubenswrapper[4735]: E0317 04:35:11.074174 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:35:23 crc kubenswrapper[4735]: I0317 04:35:23.073427 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:35:23 crc kubenswrapper[4735]: E0317 04:35:23.074444 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:35:36 crc kubenswrapper[4735]: I0317 04:35:36.073973 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:35:36 crc kubenswrapper[4735]: E0317 04:35:36.075471 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:35:51 crc kubenswrapper[4735]: I0317 04:35:51.073576 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:35:51 crc kubenswrapper[4735]: E0317 04:35:51.074404 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.175925 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562036-56tjk"]
Mar 17 04:36:00 crc kubenswrapper[4735]: E0317 04:36:00.176985 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="extract-content"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.177000 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="extract-content"
Mar 17 04:36:00 crc kubenswrapper[4735]: E0317 04:36:00.177028 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="registry-server"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.177037 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="registry-server"
Mar 17 04:36:00 crc kubenswrapper[4735]: E0317 04:36:00.177053 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="extract-utilities"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.177064 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="extract-utilities"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.177298 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f88200fa-011b-4263-a46a-3116602478b8" containerName="registry-server"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.178077 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.180229 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.180627 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.181363 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.189457 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562036-56tjk"]
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.248232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb5bn\" (UniqueName: \"kubernetes.io/projected/4a6893e1-23db-4ead-b757-2f426a182d8e-kube-api-access-wb5bn\") pod \"auto-csr-approver-29562036-56tjk\" (UID: \"4a6893e1-23db-4ead-b757-2f426a182d8e\") " pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.350785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb5bn\" (UniqueName: \"kubernetes.io/projected/4a6893e1-23db-4ead-b757-2f426a182d8e-kube-api-access-wb5bn\") pod \"auto-csr-approver-29562036-56tjk\" (UID: \"4a6893e1-23db-4ead-b757-2f426a182d8e\") " pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.383914 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb5bn\" (UniqueName: \"kubernetes.io/projected/4a6893e1-23db-4ead-b757-2f426a182d8e-kube-api-access-wb5bn\") pod \"auto-csr-approver-29562036-56tjk\" (UID: \"4a6893e1-23db-4ead-b757-2f426a182d8e\") " pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:00 crc kubenswrapper[4735]: I0317 04:36:00.494619 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:01 crc kubenswrapper[4735]: I0317 04:36:01.257046 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562036-56tjk"]
Mar 17 04:36:01 crc kubenswrapper[4735]: I0317 04:36:01.341543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562036-56tjk" event={"ID":"4a6893e1-23db-4ead-b757-2f426a182d8e","Type":"ContainerStarted","Data":"d96510debe0f074b94c8e6b04687212753afb25957fd720ab20d065cbe5e2212"}
Mar 17 04:36:03 crc kubenswrapper[4735]: I0317 04:36:03.361979 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562036-56tjk" event={"ID":"4a6893e1-23db-4ead-b757-2f426a182d8e","Type":"ContainerStarted","Data":"772cd870e53f5ab88e7194b9214d403e50d0700a8012e6615a3209bfad7e6c6c"}
Mar 17 04:36:03 crc kubenswrapper[4735]: I0317 04:36:03.384128 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562036-56tjk" podStartSLOduration=2.527207928 podStartE2EDuration="3.384100388s" podCreationTimestamp="2026-03-17 04:36:00 +0000 UTC" firstStartedPulling="2026-03-17 04:36:01.280250721 +0000 UTC m=+12386.912483699" lastFinishedPulling="2026-03-17 04:36:02.137143181 +0000 UTC m=+12387.769376159" observedRunningTime="2026-03-17 04:36:03.382457128 +0000 UTC m=+12389.014690136" watchObservedRunningTime="2026-03-17 04:36:03.384100388 +0000 UTC m=+12389.016333396"
Mar 17 04:36:04 crc kubenswrapper[4735]: I0317 04:36:04.371954 4735 generic.go:334] "Generic (PLEG): container finished" podID="4a6893e1-23db-4ead-b757-2f426a182d8e" containerID="772cd870e53f5ab88e7194b9214d403e50d0700a8012e6615a3209bfad7e6c6c" exitCode=0
Mar 17 04:36:04 crc kubenswrapper[4735]: I0317 04:36:04.372156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562036-56tjk" event={"ID":"4a6893e1-23db-4ead-b757-2f426a182d8e","Type":"ContainerDied","Data":"772cd870e53f5ab88e7194b9214d403e50d0700a8012e6615a3209bfad7e6c6c"}
Mar 17 04:36:05 crc kubenswrapper[4735]: I0317 04:36:05.090913 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:36:05 crc kubenswrapper[4735]: E0317 04:36:05.091620 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:36:05 crc kubenswrapper[4735]: I0317 04:36:05.782886 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:05 crc kubenswrapper[4735]: I0317 04:36:05.865935 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb5bn\" (UniqueName: \"kubernetes.io/projected/4a6893e1-23db-4ead-b757-2f426a182d8e-kube-api-access-wb5bn\") pod \"4a6893e1-23db-4ead-b757-2f426a182d8e\" (UID: \"4a6893e1-23db-4ead-b757-2f426a182d8e\") "
Mar 17 04:36:05 crc kubenswrapper[4735]: I0317 04:36:05.872398 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6893e1-23db-4ead-b757-2f426a182d8e-kube-api-access-wb5bn" (OuterVolumeSpecName: "kube-api-access-wb5bn") pod "4a6893e1-23db-4ead-b757-2f426a182d8e" (UID: "4a6893e1-23db-4ead-b757-2f426a182d8e"). InnerVolumeSpecName "kube-api-access-wb5bn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:36:05 crc kubenswrapper[4735]: I0317 04:36:05.967115 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb5bn\" (UniqueName: \"kubernetes.io/projected/4a6893e1-23db-4ead-b757-2f426a182d8e-kube-api-access-wb5bn\") on node \"crc\" DevicePath \"\""
Mar 17 04:36:06 crc kubenswrapper[4735]: I0317 04:36:06.449988 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562036-56tjk" event={"ID":"4a6893e1-23db-4ead-b757-2f426a182d8e","Type":"ContainerDied","Data":"d96510debe0f074b94c8e6b04687212753afb25957fd720ab20d065cbe5e2212"}
Mar 17 04:36:06 crc kubenswrapper[4735]: I0317 04:36:06.450048 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d96510debe0f074b94c8e6b04687212753afb25957fd720ab20d065cbe5e2212"
Mar 17 04:36:06 crc kubenswrapper[4735]: I0317 04:36:06.450156 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562036-56tjk"
Mar 17 04:36:06 crc kubenswrapper[4735]: I0317 04:36:06.477125 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562030-55hk5"]
Mar 17 04:36:06 crc kubenswrapper[4735]: I0317 04:36:06.488391 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562030-55hk5"]
Mar 17 04:36:07 crc kubenswrapper[4735]: I0317 04:36:07.083468 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc73e95f-ca55-44df-9c4c-6f9a8b412cba" path="/var/lib/kubelet/pods/dc73e95f-ca55-44df-9c4c-6f9a8b412cba/volumes"
Mar 17 04:36:13 crc kubenswrapper[4735]: I0317 04:36:13.997311 4735 scope.go:117] "RemoveContainer" containerID="f1101de629d65852e35adea7d6ede2a6d27e1f1a9e32f7ddc2c2c589a77abacd"
Mar 17 04:36:14 crc kubenswrapper[4735]: I0317 04:36:14.032085 4735 scope.go:117] "RemoveContainer" containerID="296d588bd003825b43bf93956e7e02e0531959b0fff1f2433570de8ebc9bceeb"
Mar 17 04:36:16 crc kubenswrapper[4735]: I0317 04:36:16.073129 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:36:16 crc kubenswrapper[4735]: E0317 04:36:16.073964 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:36:31 crc kubenswrapper[4735]: I0317 04:36:31.073217 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:36:31 crc kubenswrapper[4735]: E0317 04:36:31.074418 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z669m_openshift-machine-config-operator(0fe43c53-58a2-4450-a71c-667e10384678)\"" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678"
Mar 17 04:36:43 crc kubenswrapper[4735]: I0317 04:36:43.072781 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:36:43 crc kubenswrapper[4735]: I0317 04:36:43.847363 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"59bdc573ea560fbc2dceb95df3b5c4b658d709ef6478e78c1a741d04d2c90abc"}
Mar 17 04:37:02 crc kubenswrapper[4735]: I0317
04:37:02.003967 4735 generic.go:334] "Generic (PLEG): container finished" podID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerID="e853d67ba624a1fb26f035cf6ee0dc61191ecf7e1c9e2833068a9a8ca04c52ff" exitCode=0 Mar 17 04:37:02 crc kubenswrapper[4735]: I0317 04:37:02.004148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4kfvn/must-gather-nsckr" event={"ID":"b8b4da6b-74ed-4bef-920b-8352db585ac6","Type":"ContainerDied","Data":"e853d67ba624a1fb26f035cf6ee0dc61191ecf7e1c9e2833068a9a8ca04c52ff"} Mar 17 04:37:02 crc kubenswrapper[4735]: I0317 04:37:02.005719 4735 scope.go:117] "RemoveContainer" containerID="e853d67ba624a1fb26f035cf6ee0dc61191ecf7e1c9e2833068a9a8ca04c52ff" Mar 17 04:37:02 crc kubenswrapper[4735]: I0317 04:37:02.719421 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4kfvn_must-gather-nsckr_b8b4da6b-74ed-4bef-920b-8352db585ac6/gather/0.log" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.925488 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8427h"] Mar 17 04:37:06 crc kubenswrapper[4735]: E0317 04:37:06.926392 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6893e1-23db-4ead-b757-2f426a182d8e" containerName="oc" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.926408 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6893e1-23db-4ead-b757-2f426a182d8e" containerName="oc" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.926651 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6893e1-23db-4ead-b757-2f426a182d8e" containerName="oc" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.928640 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.945884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-catalog-content\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.945980 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-utilities\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.946030 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k9kr\" (UniqueName: \"kubernetes.io/projected/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-kube-api-access-2k9kr\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:06 crc kubenswrapper[4735]: I0317 04:37:06.979265 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8427h"] Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.047880 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-catalog-content\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.047943 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-utilities\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.047970 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k9kr\" (UniqueName: \"kubernetes.io/projected/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-kube-api-access-2k9kr\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.048554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-catalog-content\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.048586 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-utilities\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.071738 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k9kr\" (UniqueName: \"kubernetes.io/projected/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-kube-api-access-2k9kr\") pod \"redhat-operators-8427h\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.251578 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:07 crc kubenswrapper[4735]: I0317 04:37:07.759427 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8427h"] Mar 17 04:37:08 crc kubenswrapper[4735]: I0317 04:37:08.077782 4735 generic.go:334] "Generic (PLEG): container finished" podID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerID="0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae" exitCode=0 Mar 17 04:37:08 crc kubenswrapper[4735]: I0317 04:37:08.078086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerDied","Data":"0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae"} Mar 17 04:37:08 crc kubenswrapper[4735]: I0317 04:37:08.078117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerStarted","Data":"dfbda01adc7194f34c3c8ec305bf60e68f5d642f9e0b3b3f032bd95c71485abb"} Mar 17 04:37:08 crc kubenswrapper[4735]: I0317 04:37:08.080122 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 04:37:09 crc kubenswrapper[4735]: I0317 04:37:09.093218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerStarted","Data":"55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a"} Mar 17 04:37:14 crc kubenswrapper[4735]: I0317 04:37:14.138725 4735 generic.go:334] "Generic (PLEG): container finished" podID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerID="55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a" exitCode=0 Mar 17 04:37:14 crc kubenswrapper[4735]: I0317 04:37:14.138801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerDied","Data":"55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a"} Mar 17 04:37:14 crc kubenswrapper[4735]: I0317 04:37:14.183009 4735 scope.go:117] "RemoveContainer" containerID="ac596df3d531cc4da8b0de57384a500dce8b0a3a8eacd182eb072ca32a498b59" Mar 17 04:37:15 crc kubenswrapper[4735]: I0317 04:37:15.152013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerStarted","Data":"923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77"} Mar 17 04:37:15 crc kubenswrapper[4735]: I0317 04:37:15.174631 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8427h" podStartSLOduration=2.577341611 podStartE2EDuration="9.174611984s" podCreationTimestamp="2026-03-17 04:37:06 +0000 UTC" firstStartedPulling="2026-03-17 04:37:08.079830866 +0000 UTC m=+12453.712063854" lastFinishedPulling="2026-03-17 04:37:14.677101249 +0000 UTC m=+12460.309334227" observedRunningTime="2026-03-17 04:37:15.169223694 +0000 UTC m=+12460.801456672" watchObservedRunningTime="2026-03-17 04:37:15.174611984 +0000 UTC m=+12460.806844962" Mar 17 04:37:17 crc kubenswrapper[4735]: I0317 04:37:17.252769 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:17 crc kubenswrapper[4735]: I0317 04:37:17.253164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:18 crc kubenswrapper[4735]: I0317 04:37:18.334376 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8427h" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" probeResult="failure" output=< Mar 17 
04:37:18 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:37:18 crc kubenswrapper[4735]: > Mar 17 04:37:21 crc kubenswrapper[4735]: I0317 04:37:21.745036 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4kfvn/must-gather-nsckr"] Mar 17 04:37:21 crc kubenswrapper[4735]: I0317 04:37:21.746316 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4kfvn/must-gather-nsckr" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="copy" containerID="cri-o://2f557cb8adf5b0ffee88ce42452ac554fcc2abd78879695473d5d91e333b7f12" gracePeriod=2 Mar 17 04:37:21 crc kubenswrapper[4735]: I0317 04:37:21.756523 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4kfvn/must-gather-nsckr"] Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.220285 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4kfvn_must-gather-nsckr_b8b4da6b-74ed-4bef-920b-8352db585ac6/copy/0.log" Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.222624 4735 generic.go:334] "Generic (PLEG): container finished" podID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerID="2f557cb8adf5b0ffee88ce42452ac554fcc2abd78879695473d5d91e333b7f12" exitCode=143 Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.341123 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4kfvn_must-gather-nsckr_b8b4da6b-74ed-4bef-920b-8352db585ac6/copy/0.log" Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.343032 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.479738 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8b4da6b-74ed-4bef-920b-8352db585ac6-must-gather-output\") pod \"b8b4da6b-74ed-4bef-920b-8352db585ac6\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.479944 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbhvs\" (UniqueName: \"kubernetes.io/projected/b8b4da6b-74ed-4bef-920b-8352db585ac6-kube-api-access-qbhvs\") pod \"b8b4da6b-74ed-4bef-920b-8352db585ac6\" (UID: \"b8b4da6b-74ed-4bef-920b-8352db585ac6\") " Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.487022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b4da6b-74ed-4bef-920b-8352db585ac6-kube-api-access-qbhvs" (OuterVolumeSpecName: "kube-api-access-qbhvs") pod "b8b4da6b-74ed-4bef-920b-8352db585ac6" (UID: "b8b4da6b-74ed-4bef-920b-8352db585ac6"). InnerVolumeSpecName "kube-api-access-qbhvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.583343 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbhvs\" (UniqueName: \"kubernetes.io/projected/b8b4da6b-74ed-4bef-920b-8352db585ac6-kube-api-access-qbhvs\") on node \"crc\" DevicePath \"\"" Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.602132 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b4da6b-74ed-4bef-920b-8352db585ac6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b8b4da6b-74ed-4bef-920b-8352db585ac6" (UID: "b8b4da6b-74ed-4bef-920b-8352db585ac6"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:37:22 crc kubenswrapper[4735]: I0317 04:37:22.685047 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8b4da6b-74ed-4bef-920b-8352db585ac6-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 17 04:37:23 crc kubenswrapper[4735]: I0317 04:37:23.083464 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" path="/var/lib/kubelet/pods/b8b4da6b-74ed-4bef-920b-8352db585ac6/volumes" Mar 17 04:37:23 crc kubenswrapper[4735]: I0317 04:37:23.231590 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4kfvn_must-gather-nsckr_b8b4da6b-74ed-4bef-920b-8352db585ac6/copy/0.log" Mar 17 04:37:23 crc kubenswrapper[4735]: I0317 04:37:23.232203 4735 scope.go:117] "RemoveContainer" containerID="2f557cb8adf5b0ffee88ce42452ac554fcc2abd78879695473d5d91e333b7f12" Mar 17 04:37:23 crc kubenswrapper[4735]: I0317 04:37:23.232296 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4kfvn/must-gather-nsckr" Mar 17 04:37:23 crc kubenswrapper[4735]: I0317 04:37:23.255550 4735 scope.go:117] "RemoveContainer" containerID="e853d67ba624a1fb26f035cf6ee0dc61191ecf7e1c9e2833068a9a8ca04c52ff" Mar 17 04:37:28 crc kubenswrapper[4735]: I0317 04:37:28.301098 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8427h" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" probeResult="failure" output=< Mar 17 04:37:28 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:37:28 crc kubenswrapper[4735]: > Mar 17 04:37:38 crc kubenswrapper[4735]: I0317 04:37:38.305886 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8427h" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" probeResult="failure" output=< Mar 17 04:37:38 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:37:38 crc kubenswrapper[4735]: > Mar 17 04:37:48 crc kubenswrapper[4735]: I0317 04:37:48.321489 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8427h" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" probeResult="failure" output=< Mar 17 04:37:48 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Mar 17 04:37:48 crc kubenswrapper[4735]: > Mar 17 04:37:57 crc kubenswrapper[4735]: I0317 04:37:57.326460 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:57 crc kubenswrapper[4735]: I0317 04:37:57.396212 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:57 crc kubenswrapper[4735]: I0317 04:37:57.581431 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-8427h"] Mar 17 04:37:58 crc kubenswrapper[4735]: I0317 04:37:58.575437 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8427h" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" containerID="cri-o://923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77" gracePeriod=2 Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.220473 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.314370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-catalog-content\") pod \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.314569 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-utilities\") pod \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.314666 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k9kr\" (UniqueName: \"kubernetes.io/projected/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-kube-api-access-2k9kr\") pod \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\" (UID: \"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57\") " Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.315162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-utilities" (OuterVolumeSpecName: "utilities") pod "ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" (UID: 
"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.315654 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.338012 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-kube-api-access-2k9kr" (OuterVolumeSpecName: "kube-api-access-2k9kr") pod "ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" (UID: "ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57"). InnerVolumeSpecName "kube-api-access-2k9kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.417046 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k9kr\" (UniqueName: \"kubernetes.io/projected/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-kube-api-access-2k9kr\") on node \"crc\" DevicePath \"\"" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.471238 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" (UID: "ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.519114 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.588559 4735 generic.go:334] "Generic (PLEG): container finished" podID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerID="923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77" exitCode=0 Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.588601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerDied","Data":"923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77"} Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.588628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8427h" event={"ID":"ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57","Type":"ContainerDied","Data":"dfbda01adc7194f34c3c8ec305bf60e68f5d642f9e0b3b3f032bd95c71485abb"} Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.588648 4735 scope.go:117] "RemoveContainer" containerID="923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.590070 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8427h" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.613647 4735 scope.go:117] "RemoveContainer" containerID="55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.621854 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8427h"] Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.632717 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8427h"] Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.649617 4735 scope.go:117] "RemoveContainer" containerID="0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.672990 4735 scope.go:117] "RemoveContainer" containerID="923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77" Mar 17 04:37:59 crc kubenswrapper[4735]: E0317 04:37:59.676243 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77\": container with ID starting with 923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77 not found: ID does not exist" containerID="923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.676277 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77"} err="failed to get container status \"923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77\": rpc error: code = NotFound desc = could not find container \"923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77\": container with ID starting with 923e7848d3e293b27aebbc42d1e907cc0dcdda31fea3449f8fc629d09e473b77 not found: ID does 
not exist" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.676300 4735 scope.go:117] "RemoveContainer" containerID="55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a" Mar 17 04:37:59 crc kubenswrapper[4735]: E0317 04:37:59.676899 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a\": container with ID starting with 55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a not found: ID does not exist" containerID="55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.677033 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a"} err="failed to get container status \"55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a\": rpc error: code = NotFound desc = could not find container \"55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a\": container with ID starting with 55f66508926dc26f5a6b5c2236abb44c100eab5277bed244195c02f8e90d814a not found: ID does not exist" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.677130 4735 scope.go:117] "RemoveContainer" containerID="0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae" Mar 17 04:37:59 crc kubenswrapper[4735]: E0317 04:37:59.677604 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae\": container with ID starting with 0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae not found: ID does not exist" containerID="0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae" Mar 17 04:37:59 crc kubenswrapper[4735]: I0317 04:37:59.677649 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae"} err="failed to get container status \"0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae\": rpc error: code = NotFound desc = could not find container \"0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae\": container with ID starting with 0405333c2caa4c2ffe659fe8665c31f728b74f336cdbc6960cdcb8782aac92ae not found: ID does not exist" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.155877 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562038-dtd5m"] Mar 17 04:38:00 crc kubenswrapper[4735]: E0317 04:38:00.156656 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="extract-utilities" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.156676 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="extract-utilities" Mar 17 04:38:00 crc kubenswrapper[4735]: E0317 04:38:00.156693 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="copy" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.156702 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="copy" Mar 17 04:38:00 crc kubenswrapper[4735]: E0317 04:38:00.156723 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="extract-content" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.156731 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="extract-content" Mar 17 04:38:00 crc kubenswrapper[4735]: E0317 04:38:00.156743 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.156749 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" Mar 17 04:38:00 crc kubenswrapper[4735]: E0317 04:38:00.156775 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="gather" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.156781 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="gather" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.168287 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="copy" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.168334 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" containerName="registry-server" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.168381 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b4da6b-74ed-4bef-920b-8352db585ac6" containerName="gather" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.170399 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562038-dtd5m"] Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.170527 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.173394 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.173889 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.176491 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.249462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp498\" (UniqueName: \"kubernetes.io/projected/e17d61e6-f29c-4394-9175-44dad78f5698-kube-api-access-cp498\") pod \"auto-csr-approver-29562038-dtd5m\" (UID: \"e17d61e6-f29c-4394-9175-44dad78f5698\") " pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.350935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp498\" (UniqueName: \"kubernetes.io/projected/e17d61e6-f29c-4394-9175-44dad78f5698-kube-api-access-cp498\") pod \"auto-csr-approver-29562038-dtd5m\" (UID: \"e17d61e6-f29c-4394-9175-44dad78f5698\") " pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.375743 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp498\" (UniqueName: \"kubernetes.io/projected/e17d61e6-f29c-4394-9175-44dad78f5698-kube-api-access-cp498\") pod \"auto-csr-approver-29562038-dtd5m\" (UID: \"e17d61e6-f29c-4394-9175-44dad78f5698\") " pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.504602 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:00 crc kubenswrapper[4735]: I0317 04:38:00.876978 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562038-dtd5m"] Mar 17 04:38:01 crc kubenswrapper[4735]: I0317 04:38:01.089265 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57" path="/var/lib/kubelet/pods/ef5a0c9b-2eab-4b9b-9557-29b1fb9c4d57/volumes" Mar 17 04:38:01 crc kubenswrapper[4735]: I0317 04:38:01.616080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" event={"ID":"e17d61e6-f29c-4394-9175-44dad78f5698","Type":"ContainerStarted","Data":"e983b1b13253a40ffbeefc676ef5b02bfa06672f21ef914bc150f37c84b70eb2"} Mar 17 04:38:02 crc kubenswrapper[4735]: I0317 04:38:02.626281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" event={"ID":"e17d61e6-f29c-4394-9175-44dad78f5698","Type":"ContainerStarted","Data":"47a67ae2ba478936eb38962bd91519512c26d0095357f65794d410c8100a633d"} Mar 17 04:38:02 crc kubenswrapper[4735]: I0317 04:38:02.639282 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" podStartSLOduration=1.270147407 podStartE2EDuration="2.639252345s" podCreationTimestamp="2026-03-17 04:38:00 +0000 UTC" firstStartedPulling="2026-03-17 04:38:00.870135072 +0000 UTC m=+12506.502368050" lastFinishedPulling="2026-03-17 04:38:02.23924001 +0000 UTC m=+12507.871472988" observedRunningTime="2026-03-17 04:38:02.638946958 +0000 UTC m=+12508.271179936" watchObservedRunningTime="2026-03-17 04:38:02.639252345 +0000 UTC m=+12508.271485363" Mar 17 04:38:03 crc kubenswrapper[4735]: I0317 04:38:03.638065 4735 generic.go:334] "Generic (PLEG): container finished" podID="e17d61e6-f29c-4394-9175-44dad78f5698" 
containerID="47a67ae2ba478936eb38962bd91519512c26d0095357f65794d410c8100a633d" exitCode=0 Mar 17 04:38:03 crc kubenswrapper[4735]: I0317 04:38:03.638168 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" event={"ID":"e17d61e6-f29c-4394-9175-44dad78f5698","Type":"ContainerDied","Data":"47a67ae2ba478936eb38962bd91519512c26d0095357f65794d410c8100a633d"} Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.088241 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.111844 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp498\" (UniqueName: \"kubernetes.io/projected/e17d61e6-f29c-4394-9175-44dad78f5698-kube-api-access-cp498\") pod \"e17d61e6-f29c-4394-9175-44dad78f5698\" (UID: \"e17d61e6-f29c-4394-9175-44dad78f5698\") " Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.153349 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17d61e6-f29c-4394-9175-44dad78f5698-kube-api-access-cp498" (OuterVolumeSpecName: "kube-api-access-cp498") pod "e17d61e6-f29c-4394-9175-44dad78f5698" (UID: "e17d61e6-f29c-4394-9175-44dad78f5698"). InnerVolumeSpecName "kube-api-access-cp498". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.214892 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp498\" (UniqueName: \"kubernetes.io/projected/e17d61e6-f29c-4394-9175-44dad78f5698-kube-api-access-cp498\") on node \"crc\" DevicePath \"\"" Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.676387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" event={"ID":"e17d61e6-f29c-4394-9175-44dad78f5698","Type":"ContainerDied","Data":"e983b1b13253a40ffbeefc676ef5b02bfa06672f21ef914bc150f37c84b70eb2"} Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.676461 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e983b1b13253a40ffbeefc676ef5b02bfa06672f21ef914bc150f37c84b70eb2" Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.676542 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562038-dtd5m" Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.719461 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562032-rqxmk"] Mar 17 04:38:05 crc kubenswrapper[4735]: I0317 04:38:05.730721 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562032-rqxmk"] Mar 17 04:38:07 crc kubenswrapper[4735]: I0317 04:38:07.084836 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b783d962-043c-482f-b478-df8868d7672e" path="/var/lib/kubelet/pods/b783d962-043c-482f-b478-df8868d7672e/volumes" Mar 17 04:38:14 crc kubenswrapper[4735]: I0317 04:38:14.290965 4735 scope.go:117] "RemoveContainer" containerID="393c522f0407555dd2b05b8b167083d78b0e3822dad196a10cb188c2583accdd" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.428735 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8gkqb"] Mar 17 04:39:12 crc kubenswrapper[4735]: E0317 04:39:12.437122 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d61e6-f29c-4394-9175-44dad78f5698" containerName="oc" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.437158 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d61e6-f29c-4394-9175-44dad78f5698" containerName="oc" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.437470 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17d61e6-f29c-4394-9175-44dad78f5698" containerName="oc" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.443745 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.455688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkqb"] Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.597430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4vw\" (UniqueName: \"kubernetes.io/projected/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-kube-api-access-nc4vw\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.597536 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-catalog-content\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.597627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-utilities\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.606359 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.607274 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.700009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-catalog-content\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.700531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-catalog-content\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.700794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-utilities\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.701198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-utilities\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.701295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4vw\" (UniqueName: \"kubernetes.io/projected/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-kube-api-access-nc4vw\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.723228 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4vw\" (UniqueName: \"kubernetes.io/projected/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-kube-api-access-nc4vw\") pod \"redhat-marketplace-8gkqb\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:12 crc kubenswrapper[4735]: I0317 04:39:12.765510 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:13 crc kubenswrapper[4735]: I0317 04:39:13.061988 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkqb"] Mar 17 04:39:13 crc kubenswrapper[4735]: I0317 04:39:13.516908 4735 generic.go:334] "Generic (PLEG): container finished" podID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerID="7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5" exitCode=0 Mar 17 04:39:13 crc kubenswrapper[4735]: I0317 04:39:13.517657 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerDied","Data":"7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5"} Mar 17 04:39:13 crc kubenswrapper[4735]: I0317 04:39:13.519343 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerStarted","Data":"e8021a7a26fbb92f58e04f924a2a15f51b88c289466f8fd479a0b1a631da0fd5"} Mar 17 04:39:15 crc kubenswrapper[4735]: I0317 04:39:15.543603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerStarted","Data":"63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f"} Mar 17 04:39:16 crc kubenswrapper[4735]: I0317 04:39:16.555490 4735 generic.go:334] "Generic (PLEG): container finished" podID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerID="63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f" exitCode=0 Mar 17 04:39:16 crc kubenswrapper[4735]: I0317 04:39:16.555598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" 
event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerDied","Data":"63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f"} Mar 17 04:39:17 crc kubenswrapper[4735]: I0317 04:39:17.567261 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerStarted","Data":"c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6"} Mar 17 04:39:17 crc kubenswrapper[4735]: I0317 04:39:17.591063 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8gkqb" podStartSLOduration=2.038135688 podStartE2EDuration="5.59103942s" podCreationTimestamp="2026-03-17 04:39:12 +0000 UTC" firstStartedPulling="2026-03-17 04:39:13.519708421 +0000 UTC m=+12579.151941409" lastFinishedPulling="2026-03-17 04:39:17.072612133 +0000 UTC m=+12582.704845141" observedRunningTime="2026-03-17 04:39:17.585510817 +0000 UTC m=+12583.217743795" watchObservedRunningTime="2026-03-17 04:39:17.59103942 +0000 UTC m=+12583.223272408" Mar 17 04:39:22 crc kubenswrapper[4735]: I0317 04:39:22.766179 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:22 crc kubenswrapper[4735]: I0317 04:39:22.766606 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:22 crc kubenswrapper[4735]: I0317 04:39:22.833666 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:23 crc kubenswrapper[4735]: I0317 04:39:23.723016 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:23 crc kubenswrapper[4735]: I0317 04:39:23.804843 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8gkqb"] Mar 17 04:39:25 crc kubenswrapper[4735]: I0317 04:39:25.646605 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8gkqb" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="registry-server" containerID="cri-o://c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6" gracePeriod=2 Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.210494 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.336709 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc4vw\" (UniqueName: \"kubernetes.io/projected/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-kube-api-access-nc4vw\") pod \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.337042 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-catalog-content\") pod \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.337085 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-utilities\") pod \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\" (UID: \"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca\") " Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.338295 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-utilities" (OuterVolumeSpecName: "utilities") pod "3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" (UID: 
"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.343581 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-kube-api-access-nc4vw" (OuterVolumeSpecName: "kube-api-access-nc4vw") pod "3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" (UID: "3fc7cfcd-7248-4da3-b0d4-20e8f556ffca"). InnerVolumeSpecName "kube-api-access-nc4vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.370914 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" (UID: "3fc7cfcd-7248-4da3-b0d4-20e8f556ffca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.440605 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc4vw\" (UniqueName: \"kubernetes.io/projected/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-kube-api-access-nc4vw\") on node \"crc\" DevicePath \"\"" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.440657 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.440684 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.657849 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerID="c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6" exitCode=0 Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.657959 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerDied","Data":"c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6"} Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.657990 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkqb" event={"ID":"3fc7cfcd-7248-4da3-b0d4-20e8f556ffca","Type":"ContainerDied","Data":"e8021a7a26fbb92f58e04f924a2a15f51b88c289466f8fd479a0b1a631da0fd5"} Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.657987 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkqb" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.658006 4735 scope.go:117] "RemoveContainer" containerID="c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.680533 4735 scope.go:117] "RemoveContainer" containerID="63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.715434 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkqb"] Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.718667 4735 scope.go:117] "RemoveContainer" containerID="7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.730911 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkqb"] Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.749460 4735 scope.go:117] "RemoveContainer" 
containerID="c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6" Mar 17 04:39:26 crc kubenswrapper[4735]: E0317 04:39:26.750056 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6\": container with ID starting with c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6 not found: ID does not exist" containerID="c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.750112 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6"} err="failed to get container status \"c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6\": rpc error: code = NotFound desc = could not find container \"c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6\": container with ID starting with c2915b65cde736f8fa554b35fd2612d3fbcd7db0548ad940cb02fbfbf5bc7fa6 not found: ID does not exist" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.750144 4735 scope.go:117] "RemoveContainer" containerID="63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f" Mar 17 04:39:26 crc kubenswrapper[4735]: E0317 04:39:26.750447 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f\": container with ID starting with 63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f not found: ID does not exist" containerID="63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.750485 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f"} err="failed to get container status \"63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f\": rpc error: code = NotFound desc = could not find container \"63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f\": container with ID starting with 63b2d1f4dddca7061fffdc7eec0fb41e6305f0e1e7db0ecb2a01714241c5df7f not found: ID does not exist" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.750510 4735 scope.go:117] "RemoveContainer" containerID="7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5" Mar 17 04:39:26 crc kubenswrapper[4735]: E0317 04:39:26.750827 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5\": container with ID starting with 7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5 not found: ID does not exist" containerID="7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5" Mar 17 04:39:26 crc kubenswrapper[4735]: I0317 04:39:26.750876 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5"} err="failed to get container status \"7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5\": rpc error: code = NotFound desc = could not find container \"7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5\": container with ID starting with 7061f1a0c98ca6a9dd9c20505151fbbba350861b459bdb593f29556b9db4e7d5 not found: ID does not exist" Mar 17 04:39:27 crc kubenswrapper[4735]: I0317 04:39:27.088223 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" path="/var/lib/kubelet/pods/3fc7cfcd-7248-4da3-b0d4-20e8f556ffca/volumes" Mar 17 04:39:42 crc kubenswrapper[4735]: I0317 
04:39:42.606292 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 04:39:42 crc kubenswrapper[4735]: I0317 04:39:42.607081 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.159424 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562040-kws8t"] Mar 17 04:40:00 crc kubenswrapper[4735]: E0317 04:40:00.160787 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="extract-utilities" Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.160817 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="extract-utilities" Mar 17 04:40:00 crc kubenswrapper[4735]: E0317 04:40:00.160910 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="extract-content" Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.160923 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="extract-content" Mar 17 04:40:00 crc kubenswrapper[4735]: E0317 04:40:00.160952 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="registry-server" Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.160964 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="registry-server"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.161341 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc7cfcd-7248-4da3-b0d4-20e8f556ffca" containerName="registry-server"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.162431 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.164556 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fw8vf"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.167905 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.168168 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.174805 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562040-kws8t"]
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.215688 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44fs\" (UniqueName: \"kubernetes.io/projected/e7aae471-e1df-4b2d-8722-341dbd8fcfeb-kube-api-access-k44fs\") pod \"auto-csr-approver-29562040-kws8t\" (UID: \"e7aae471-e1df-4b2d-8722-341dbd8fcfeb\") " pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.319205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44fs\" (UniqueName: \"kubernetes.io/projected/e7aae471-e1df-4b2d-8722-341dbd8fcfeb-kube-api-access-k44fs\") pod \"auto-csr-approver-29562040-kws8t\" (UID: \"e7aae471-e1df-4b2d-8722-341dbd8fcfeb\") " pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.341589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44fs\" (UniqueName: \"kubernetes.io/projected/e7aae471-e1df-4b2d-8722-341dbd8fcfeb-kube-api-access-k44fs\") pod \"auto-csr-approver-29562040-kws8t\" (UID: \"e7aae471-e1df-4b2d-8722-341dbd8fcfeb\") " pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.485315 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:00 crc kubenswrapper[4735]: I0317 04:40:00.829403 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562040-kws8t"]
Mar 17 04:40:01 crc kubenswrapper[4735]: I0317 04:40:01.087339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562040-kws8t" event={"ID":"e7aae471-e1df-4b2d-8722-341dbd8fcfeb","Type":"ContainerStarted","Data":"10f203f46d2226bdb674955efab0aeb5389a5342ef718f535a04844c60441471"}
Mar 17 04:40:03 crc kubenswrapper[4735]: I0317 04:40:03.121706 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7aae471-e1df-4b2d-8722-341dbd8fcfeb" containerID="69728a9a1ac272bac23d09606cbfd133fc86799f267d88a8f3ec0a275cc5f1ca" exitCode=0
Mar 17 04:40:03 crc kubenswrapper[4735]: I0317 04:40:03.121821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562040-kws8t" event={"ID":"e7aae471-e1df-4b2d-8722-341dbd8fcfeb","Type":"ContainerDied","Data":"69728a9a1ac272bac23d09606cbfd133fc86799f267d88a8f3ec0a275cc5f1ca"}
Mar 17 04:40:04 crc kubenswrapper[4735]: I0317 04:40:04.524308 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:04 crc kubenswrapper[4735]: I0317 04:40:04.635097 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k44fs\" (UniqueName: \"kubernetes.io/projected/e7aae471-e1df-4b2d-8722-341dbd8fcfeb-kube-api-access-k44fs\") pod \"e7aae471-e1df-4b2d-8722-341dbd8fcfeb\" (UID: \"e7aae471-e1df-4b2d-8722-341dbd8fcfeb\") "
Mar 17 04:40:04 crc kubenswrapper[4735]: I0317 04:40:04.643132 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7aae471-e1df-4b2d-8722-341dbd8fcfeb-kube-api-access-k44fs" (OuterVolumeSpecName: "kube-api-access-k44fs") pod "e7aae471-e1df-4b2d-8722-341dbd8fcfeb" (UID: "e7aae471-e1df-4b2d-8722-341dbd8fcfeb"). InnerVolumeSpecName "kube-api-access-k44fs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 04:40:04 crc kubenswrapper[4735]: I0317 04:40:04.738620 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k44fs\" (UniqueName: \"kubernetes.io/projected/e7aae471-e1df-4b2d-8722-341dbd8fcfeb-kube-api-access-k44fs\") on node \"crc\" DevicePath \"\""
Mar 17 04:40:05 crc kubenswrapper[4735]: I0317 04:40:05.157598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562040-kws8t" event={"ID":"e7aae471-e1df-4b2d-8722-341dbd8fcfeb","Type":"ContainerDied","Data":"10f203f46d2226bdb674955efab0aeb5389a5342ef718f535a04844c60441471"}
Mar 17 04:40:05 crc kubenswrapper[4735]: I0317 04:40:05.158574 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f203f46d2226bdb674955efab0aeb5389a5342ef718f535a04844c60441471"
Mar 17 04:40:05 crc kubenswrapper[4735]: I0317 04:40:05.158181 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562040-kws8t"
Mar 17 04:40:05 crc kubenswrapper[4735]: I0317 04:40:05.645999 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562034-g9whz"]
Mar 17 04:40:05 crc kubenswrapper[4735]: I0317 04:40:05.651029 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562034-g9whz"]
Mar 17 04:40:07 crc kubenswrapper[4735]: I0317 04:40:07.084376 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922fb719-bcea-458f-9d6a-b10825eafaaa" path="/var/lib/kubelet/pods/922fb719-bcea-458f-9d6a-b10825eafaaa/volumes"
Mar 17 04:40:12 crc kubenswrapper[4735]: I0317 04:40:12.605959 4735 patch_prober.go:28] interesting pod/machine-config-daemon-z669m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 04:40:12 crc kubenswrapper[4735]: I0317 04:40:12.606343 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 04:40:12 crc kubenswrapper[4735]: I0317 04:40:12.606387 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z669m"
Mar 17 04:40:12 crc kubenswrapper[4735]: I0317 04:40:12.608452 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59bdc573ea560fbc2dceb95df3b5c4b658d709ef6478e78c1a741d04d2c90abc"} pod="openshift-machine-config-operator/machine-config-daemon-z669m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 04:40:12 crc kubenswrapper[4735]: I0317 04:40:12.608569 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z669m" podUID="0fe43c53-58a2-4450-a71c-667e10384678" containerName="machine-config-daemon" containerID="cri-o://59bdc573ea560fbc2dceb95df3b5c4b658d709ef6478e78c1a741d04d2c90abc" gracePeriod=600
Mar 17 04:40:13 crc kubenswrapper[4735]: I0317 04:40:13.237359 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fe43c53-58a2-4450-a71c-667e10384678" containerID="59bdc573ea560fbc2dceb95df3b5c4b658d709ef6478e78c1a741d04d2c90abc" exitCode=0
Mar 17 04:40:13 crc kubenswrapper[4735]: I0317 04:40:13.237958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerDied","Data":"59bdc573ea560fbc2dceb95df3b5c4b658d709ef6478e78c1a741d04d2c90abc"}
Mar 17 04:40:13 crc kubenswrapper[4735]: I0317 04:40:13.237995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z669m" event={"ID":"0fe43c53-58a2-4450-a71c-667e10384678","Type":"ContainerStarted","Data":"23607daee3b3ada043891c163f5ac76233f4701ac3721365a4146b25b863df51"}
Mar 17 04:40:13 crc kubenswrapper[4735]: I0317 04:40:13.238016 4735 scope.go:117] "RemoveContainer" containerID="9965e115ea1c01260aa7359f50a7e164b7bee912166dcbb7d8de6d514134f02a"
Mar 17 04:40:14 crc kubenswrapper[4735]: I0317 04:40:14.482699 4735 scope.go:117] "RemoveContainer" containerID="af49b41de559d1acc44d0df22b7bf18385cc847ded95be3072a9bbf1b2c52d3c"